Dec 05 23:19:38 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 05 23:19:38 crc restorecon[4681]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 23:19:38 crc restorecon[4681]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc 
restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 23:19:38 crc 
restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 05 
23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 23:19:38 crc 
restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 23:19:38 crc 
restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:38 
crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 
23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 23:19:38 crc 
restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc 
restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 05 23:19:38 crc restorecon[4681]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc 
restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:38 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc 
restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc 
restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc 
restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 23:19:39 crc 
restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 23:19:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 23:19:39 crc restorecon[4681]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 23:19:39 crc restorecon[4681]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 05 23:19:39 crc kubenswrapper[4734]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 05 23:19:39 crc kubenswrapper[4734]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 05 23:19:39 crc kubenswrapper[4734]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 05 23:19:39 crc kubenswrapper[4734]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 05 23:19:39 crc kubenswrapper[4734]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 05 23:19:39 crc kubenswrapper[4734]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.434264 4734 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437118 4734 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437135 4734 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437140 4734 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437146 4734 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437151 4734 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437155 4734 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437159 4734 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437162 4734 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437166 4734 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437171 4734 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437175 4734 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437180 4734 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437185 4734 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437189 4734 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437194 4734 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437197 4734 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437208 4734 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437212 4734 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437216 4734 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437220 4734 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437224 4734 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437228 4734 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437231 4734 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437235 4734 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437239 4734 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437243 4734 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437247 4734 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437251 4734 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437254 4734 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437258 4734 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437262 4734 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437265 4734 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437269 4734 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437273 4734 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437276 4734 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437280 4734 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437284 4734 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437287 4734 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437291 4734 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437295 4734 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437298 4734 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437302 4734 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437308 4734 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437312 4734 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437317 4734 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437321 4734 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437326 4734 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437330 4734 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437334 4734 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437339 4734 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437343 4734 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437351 4734 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437355 4734 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437359 4734 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437363 4734 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437367 4734 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437371 4734 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437375 4734 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437380 4734 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
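Editor's note: each entry above carries a klog-style prefix (severity letter, MMDD date, wall-clock time, PID, source file:line, then the message) after the journald header. A minimal parsing sketch, assuming only the field layout visible in these lines (the helper name `parse_klog` is hypothetical):

```python
import re

# Assumed klog prefix layout, taken from the entries above:
#   <severity><MMDD> <HH:MM:SS.micros> <pid> <file:line>] <message>
KLOG_RE = re.compile(
    r"(?P<sev>[IWEF])(?P<mmdd>\d{4}) "
    r"(?P<time>\d{2}:\d{2}:\d{2}\.\d+)\s+"
    r"(?P<pid>\d+) "
    r"(?P<src>[\w.]+:\d+)\] "
    r"(?P<msg>.*)"
)

def parse_klog(line: str):
    """Return a dict of klog fields, or None if the line has no klog prefix."""
    m = KLOG_RE.search(line)
    return m.groupdict() if m else None

entry = ("Dec 05 23:19:39 crc kubenswrapper[4734]: "
         "W1205 23:19:39.437343 4734 feature_gate.go:330] "
         "unrecognized feature gate: AutomatedEtcdBackup")
parsed = parse_klog(entry)
```

This treats the journald header (`Dec 05 23:19:39 crc kubenswrapper[4734]:`) as opaque and only matches the klog portion.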
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437384 4734 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437388 4734 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437392 4734 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437396 4734 feature_gate.go:330] unrecognized feature gate: Example
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437400 4734 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437405 4734 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437409 4734 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437413 4734 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437416 4734 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437420 4734 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437424 4734 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.437427 4734 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437503 4734 flags.go:64] FLAG: --address="0.0.0.0"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437511 4734 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437532 4734 flags.go:64] FLAG: --anonymous-auth="true"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437538 4734 flags.go:64] FLAG: --application-metrics-count-limit="100"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437544 4734 flags.go:64] FLAG: --authentication-token-webhook="false"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437549 4734 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437555 4734 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437561 4734 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437566 4734 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437570 4734 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437576 4734 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437580 4734 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437585 4734 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437589 4734 flags.go:64] FLAG: --cgroup-root=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437594 4734 flags.go:64] FLAG: --cgroups-per-qos="true"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437599 4734 flags.go:64] FLAG: --client-ca-file=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437603 4734 flags.go:64] FLAG: --cloud-config=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437608 4734 flags.go:64] FLAG: --cloud-provider=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437612 4734 flags.go:64] FLAG: --cluster-dns="[]"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437617 4734 flags.go:64] FLAG: --cluster-domain=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437621 4734 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437625 4734 flags.go:64] FLAG: --config-dir=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437630 4734 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437634 4734 flags.go:64] FLAG: --container-log-max-files="5"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437640 4734 flags.go:64] FLAG: --container-log-max-size="10Mi"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437644 4734 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437648 4734 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437653 4734 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437657 4734 flags.go:64] FLAG: --contention-profiling="false"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437661 4734 flags.go:64] FLAG: --cpu-cfs-quota="true"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437665 4734 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437669 4734 flags.go:64] FLAG: --cpu-manager-policy="none"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437673 4734 flags.go:64] FLAG: --cpu-manager-policy-options=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437678 4734 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437682 4734 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437686 4734 flags.go:64] FLAG: --enable-debugging-handlers="true"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437690 4734 flags.go:64] FLAG: --enable-load-reader="false"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437695 4734 flags.go:64] FLAG: --enable-server="true"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437699 4734 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437881 4734 flags.go:64] FLAG: --event-burst="100"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437887 4734 flags.go:64] FLAG: --event-qps="50"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.439320 4734 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.439352 4734 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.439365 4734 flags.go:64] FLAG: --eviction-hard=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.439380 4734 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.439396 4734 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.439412 4734 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.439429 4734 flags.go:64] FLAG: --eviction-soft=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.439443 4734 flags.go:64] FLAG: --eviction-soft-grace-period=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.439469 4734 flags.go:64] FLAG: --exit-on-lock-contention="false"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.439483 4734 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.439521 4734 flags.go:64] FLAG: --experimental-mounter-path=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.439599 4734 flags.go:64] FLAG: --fail-cgroupv1="false"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.439616 4734 flags.go:64] FLAG: --fail-swap-on="true"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.439631 4734 flags.go:64] FLAG: --feature-gates=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.439712 4734 flags.go:64] FLAG: --file-check-frequency="20s"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.439730 4734 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.439744 4734 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440236 4734 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440250 4734 flags.go:64] FLAG: --healthz-port="10248"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440263 4734 flags.go:64] FLAG: --help="false"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440276 4734 flags.go:64] FLAG: --hostname-override=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440286 4734 flags.go:64] FLAG: --housekeeping-interval="10s"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440296 4734 flags.go:64] FLAG: --http-check-frequency="20s"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440306 4734 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440315 4734 flags.go:64] FLAG: --image-credential-provider-config=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440324 4734 flags.go:64] FLAG: --image-gc-high-threshold="85"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440334 4734 flags.go:64] FLAG: --image-gc-low-threshold="80"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440343 4734 flags.go:64] FLAG: --image-service-endpoint=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440352 4734 flags.go:64] FLAG: --kernel-memcg-notification="false"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440362 4734 flags.go:64] FLAG: --kube-api-burst="100"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440371 4734 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440384 4734 flags.go:64] FLAG: --kube-api-qps="50"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440393 4734 flags.go:64] FLAG: --kube-reserved=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440403 4734 flags.go:64] FLAG: --kube-reserved-cgroup=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440412 4734 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440422 4734 flags.go:64] FLAG: --kubelet-cgroups=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440432 4734 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440441 4734 flags.go:64] FLAG: --lock-file=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440453 4734 flags.go:64] FLAG: --log-cadvisor-usage="false"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440462 4734 flags.go:64] FLAG: --log-flush-frequency="5s"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440472 4734 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440515 4734 flags.go:64] FLAG: --log-json-split-stream="false"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440557 4734 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440571 4734 flags.go:64] FLAG: --log-text-split-stream="false"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440583 4734 flags.go:64] FLAG: --logging-format="text"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440595 4734 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440605 4734 flags.go:64] FLAG: --make-iptables-util-chains="true"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440614 4734 flags.go:64] FLAG: --manifest-url=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440624 4734 flags.go:64] FLAG: --manifest-url-header=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440638 4734 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440648 4734 flags.go:64] FLAG: --max-open-files="1000000"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440660 4734 flags.go:64] FLAG: --max-pods="110"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440670 4734 flags.go:64] FLAG: --maximum-dead-containers="-1"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440679 4734 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440688 4734 flags.go:64] FLAG: --memory-manager-policy="None"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440698 4734 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440707 4734 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440716 4734 flags.go:64] FLAG: --node-ip="192.168.126.11"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440726 4734 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440759 4734 flags.go:64] FLAG: --node-status-max-images="50"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440769 4734 flags.go:64] FLAG: --node-status-update-frequency="10s"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440778 4734 flags.go:64] FLAG: --oom-score-adj="-999"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440788 4734 flags.go:64] FLAG: --pod-cidr=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440796 4734 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440811 4734 flags.go:64] FLAG: --pod-manifest-path=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440820 4734 flags.go:64] FLAG: --pod-max-pids="-1"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440829 4734 flags.go:64] FLAG: --pods-per-core="0"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440838 4734 flags.go:64] FLAG: --port="10250"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440848 4734 flags.go:64] FLAG: --protect-kernel-defaults="false"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440857 4734 flags.go:64] FLAG: --provider-id=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440866 4734 flags.go:64] FLAG: --qos-reserved=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440875 4734 flags.go:64] FLAG: --read-only-port="10255"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440886 4734 flags.go:64] FLAG: --register-node="true"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440897 4734 flags.go:64] FLAG: --register-schedulable="true"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440906 4734 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440921 4734 flags.go:64] FLAG: --registry-burst="10"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440931 4734 flags.go:64] FLAG: --registry-qps="5"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440941 4734 flags.go:64] FLAG: --reserved-cpus=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440950 4734 flags.go:64] FLAG: --reserved-memory=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440961 4734 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440970 4734 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440980 4734 flags.go:64] FLAG: --rotate-certificates="false"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440990 4734 flags.go:64] FLAG: --rotate-server-certificates="false"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440999 4734 flags.go:64] FLAG: --runonce="false"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.441008 4734 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.441018 4734 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.441027 4734 flags.go:64] FLAG: --seccomp-default="false"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.441036 4734 flags.go:64] FLAG: --serialize-image-pulls="true"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.441045 4734 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.441055 4734 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.441064 4734 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.441073 4734 flags.go:64] FLAG: --storage-driver-password="root"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.441083 4734 flags.go:64] FLAG: --storage-driver-secure="false"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.441092 4734 flags.go:64] FLAG: --storage-driver-table="stats"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.441102 4734 flags.go:64] FLAG: --storage-driver-user="root"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.441112 4734 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.441122 4734 flags.go:64] FLAG: --sync-frequency="1m0s"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.441131 4734 flags.go:64] FLAG: --system-cgroups=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.441141 4734 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.441157 4734 flags.go:64] FLAG: --system-reserved-cgroup=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.441167 4734 flags.go:64] FLAG: --tls-cert-file=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.441176 4734 flags.go:64] FLAG: --tls-cipher-suites="[]"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.441191 4734 flags.go:64] FLAG: --tls-min-version=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.441200 4734 flags.go:64] FLAG: --tls-private-key-file=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.441209 4734 flags.go:64] FLAG: --topology-manager-policy="none"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.441218 4734 flags.go:64] FLAG: --topology-manager-policy-options=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.441227 4734 flags.go:64] FLAG: --topology-manager-scope="container"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.441236 4734 flags.go:64] FLAG: --v="2"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.441248 4734 flags.go:64] FLAG: --version="false"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.441260 4734 flags.go:64] FLAG: --vmodule=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.441271 4734 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.441280 4734 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441511 4734 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441559 4734 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441576 4734 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441588 4734 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441598 4734 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441608 4734 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441617 4734 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441626 4734 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441634 4734 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441643 4734 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441651 4734 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441659 4734 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441667 4734 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441675 4734 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441683 4734 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441691 4734 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441699 4734 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441707 4734 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441717 4734 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441728 4734 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441738 4734 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441746 4734 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441754 4734 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441764 4734 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441773 4734 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441782 4734 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441790 4734 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441800 4734 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
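Editor's note: the flags.go:64 entries above dump every kubelet command-line flag with its effective value, one per entry, in the form `FLAG: --name="value"`. A minimal sketch, assuming only that format, that collects them into a dict (the helper name `collect_flags` is hypothetical):

```python
import re

# Assumed entry shape, as seen in the log above: FLAG: --name="value"
FLAG_RE = re.compile(r'FLAG: --(?P<name>[\w-]+)="(?P<value>[^"]*)"')

def collect_flags(lines):
    """Map flag name -> effective value for every FLAG: entry found."""
    flags = {}
    for line in lines:
        m = FLAG_RE.search(line)
        if m:
            flags[m.group("name")] = m.group("value")
    return flags

sample = [
    'Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.437585 4734 flags.go:64] FLAG: --cgroup-driver="cgroupfs"',
    'Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.440660 4734 flags.go:64] FLAG: --max-pods="110"',
]
flags = collect_flags(sample)
```

Values are kept as strings, since the dump quotes everything uniformly regardless of the flag's real type.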
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441812 4734 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441821 4734 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441828 4734 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441836 4734 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441845 4734 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441853 4734 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441861 4734 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441870 4734 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441878 4734 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441886 4734 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441894 4734 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441902 4734 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441910 4734 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441917 4734 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441925 4734 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441934 4734 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441942 4734 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441950 4734 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441958 4734 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441966 4734 feature_gate.go:330] unrecognized feature gate: Example
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441974 4734 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441985 4734 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
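Editor's note: the same unrecognized-gate warnings appear in two passes (the .4372xx/.4373xx batch and the .4414xx–.4419xx batch), so the distinct gates are far fewer than the raw warning count suggests. A small sketch, assuming only the visible message text, that reduces the warnings to a sorted set of unique gate names:

```python
def unrecognized_gates(lines):
    """Collect the unique gate names from 'unrecognized feature gate:' warnings."""
    marker = "unrecognized feature gate: "
    gates = set()
    for line in lines:
        idx = line.find(marker)
        if idx != -1:
            # The gate name runs to the next whitespace (or end of line).
            gates.add(line[idx + len(marker):].split()[0])
    return sorted(gates)

sample = [
    "W1205 23:19:39.437355 4734 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement",
    "W1205 23:19:39.441925 4734 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement",
    "W1205 23:19:39.441966 4734 feature_gate.go:330] unrecognized feature gate: Example",
]
unique = unrecognized_gates(sample)
```

Running this over the full journal (e.g. the output of `journalctl -u kubelet`) would give one line per distinct gate instead of one per warning.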
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.441995 4734 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.442004 4734 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.442014 4734 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.442022 4734 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.442033 4734 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.442041 4734 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.442050 4734 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.442097 4734 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.442108 4734 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.442118 4734 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.442128 4734 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.442136 4734 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.442145 4734 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.442153 4734 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.442163 4734 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.442171 4734 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.442179 4734 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.442188 4734 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.442196 4734 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.442204 4734 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.442213 4734 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.442239 4734 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.453860 4734 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.453890 4734 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454011 4734 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454024 4734 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 05
23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454033 4734 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454042 4734 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454088 4734 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454097 4734 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454106 4734 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454114 4734 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454122 4734 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454130 4734 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454138 4734 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454146 4734 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454154 4734 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454162 4734 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454170 4734 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454178 4734 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454186 4734 feature_gate.go:330] unrecognized feature gate: 
MinimumKubeletVersion Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454194 4734 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454202 4734 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454214 4734 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454226 4734 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454236 4734 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454247 4734 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454256 4734 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454264 4734 feature_gate.go:330] unrecognized feature gate: Example Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454272 4734 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454281 4734 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454292 4734 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454301 4734 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454309 4734 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454317 4734 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454328 4734 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454336 4734 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454344 4734 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454352 4734 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454361 4734 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454368 4734 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454377 4734 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454385 4734 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454393 4734 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454400 4734 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454409 4734 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454417 4734 feature_gate.go:330] 
unrecognized feature gate: InsightsConfig Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454425 4734 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454433 4734 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454444 4734 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454454 4734 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454465 4734 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454473 4734 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454482 4734 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454491 4734 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454499 4734 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454508 4734 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454516 4734 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454546 4734 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454554 4734 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454562 4734 
feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454570 4734 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454578 4734 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454586 4734 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454594 4734 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454602 4734 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454612 4734 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454622 4734 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454632 4734 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454640 4734 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454648 4734 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454658 4734 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454666 4734 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454674 4734 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 23:19:39 crc 
kubenswrapper[4734]: W1205 23:19:39.454682 4734 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.454695 4734 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454923 4734 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454935 4734 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454944 4734 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454953 4734 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454963 4734 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454974 4734 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454983 4734 feature_gate.go:330] unrecognized feature gate: Example Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.454992 4734 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455001 4734 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455010 4734 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455018 4734 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455027 4734 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455035 4734 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455044 4734 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455051 4734 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455062 4734 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455072 4734 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455080 4734 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455089 4734 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455097 4734 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455106 4734 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455115 4734 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455123 4734 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455131 4734 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455140 4734 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455148 4734 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455156 4734 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455165 4734 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455173 4734 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455184 4734 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. 
It will be removed in a future release. Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455196 4734 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455207 4734 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455215 4734 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455225 4734 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455233 4734 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455242 4734 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455251 4734 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455260 4734 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455268 4734 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455275 4734 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455284 4734 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455292 4734 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455301 4734 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455309 4734 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 23:19:39 crc kubenswrapper[4734]: 
W1205 23:19:39.455316 4734 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455324 4734 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455332 4734 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455341 4734 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455348 4734 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455356 4734 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455365 4734 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455373 4734 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455380 4734 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455389 4734 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455396 4734 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455404 4734 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455412 4734 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455420 4734 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455429 4734 feature_gate.go:330] unrecognized feature gate: 
SigstoreImageVerification Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455440 4734 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455449 4734 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455457 4734 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455466 4734 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455474 4734 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455482 4734 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455490 4734 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455498 4734 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455507 4734 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455516 4734 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455549 4734 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.455557 4734 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.455569 4734 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false 
NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.455757 4734 server.go:940] "Client rotation is on, will bootstrap in background" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.460060 4734 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.460165 4734 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.460997 4734 server.go:997] "Starting client certificate rotation" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.461035 4734 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.461278 4734 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-27 10:00:10.107682544 +0000 UTC Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.461356 4734 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 514h40m30.646332888s for next certificate rotation Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.469072 4734 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.471469 4734 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.481151 4734 log.go:25] "Validated CRI v1 runtime API" Dec 05 23:19:39 crc 
kubenswrapper[4734]: I1205 23:19:39.504850 4734 log.go:25] "Validated CRI v1 image API" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.507108 4734 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.510788 4734 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-05-23-15-28-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.510842 4734 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}] Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.535254 4734 manager.go:217] Machine: {Timestamp:2025-12-05 23:19:39.53386972 +0000 UTC m=+0.217274006 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:33f74fdf-48ac-436c-92bc-f6724ef71400 BootID:bba22b9d-56b5-49db-9757-30928c54213a Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 
Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:09:fe:0a Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:09:fe:0a Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:f0:7e:02 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:73:67:09 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:3a:6e:22 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:b2:99:c4 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:aa:13:f0:ef:cf:fd Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:52:7e:ba:4b:92:e9 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} 
{Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 
Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.535547 4734 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.535779 4734 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.536179 4734 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.536390 4734 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.536437 4734 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.536739 4734 topology_manager.go:138] "Creating topology manager with none policy"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.536751 4734 container_manager_linux.go:303] "Creating device plugin manager"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.536949 4734 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.536984 4734 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.537392 4734 state_mem.go:36] "Initialized new in-memory state store"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.537499 4734 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.538246 4734 kubelet.go:418] "Attempting to sync node with API server"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.538271 4734 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.538297 4734 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.538316 4734 kubelet.go:324] "Adding apiserver pod source"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.538332 4734 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.540487 4734 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.541920 4734 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.542981 4734 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused
Dec 05 23:19:39 crc kubenswrapper[4734]: E1205 23:19:39.543118 4734 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError"
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.543113 4734 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.543224 4734 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 05 23:19:39 crc kubenswrapper[4734]: E1205 23:19:39.543244 4734 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.544051 4734 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.544097 4734 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.544111 4734 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.544126 4734 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.544147 4734 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.544161 4734 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.544175 4734 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.544198 4734 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.544214 4734 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.544231 4734 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.544253 4734 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.544267 4734 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.544648 4734 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.545466 4734 server.go:1280] "Started kubelet"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.545759 4734 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.545914 4734 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.545933 4734 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.546966 4734 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 05 23:19:39 crc systemd[1]: Started Kubernetes Kubelet.
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.548558 4734 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.548622 4734 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.548999 4734 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 17:59:21.180753576 +0000 UTC
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.549175 4734 server.go:460] "Adding debug handlers to kubelet server"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.549870 4734 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.549902 4734 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.550306 4734 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused
Dec 05 23:19:39 crc kubenswrapper[4734]: E1205 23:19:39.550370 4734 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.550332 4734 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 05 23:19:39 crc kubenswrapper[4734]: E1205 23:19:39.549876 4734 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 05 23:19:39 crc kubenswrapper[4734]: E1205 23:19:39.550939 4734 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="200ms"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.553356 4734 factory.go:55] Registering systemd factory
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.553404 4734 factory.go:221] Registration of the systemd container factory successfully
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.554040 4734 factory.go:153] Registering CRI-O factory
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.554085 4734 factory.go:221] Registration of the crio container factory successfully
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.554187 4734 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.554243 4734 factory.go:103] Registering Raw factory
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.554305 4734 manager.go:1196] Started watching for new ooms in manager
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.555380 4734 manager.go:319] Starting recovery of all containers
Dec 05 23:19:39 crc kubenswrapper[4734]: E1205 23:19:39.554880 4734 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.38:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187e75079d9dc998 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 23:19:39.545397656 +0000 UTC m=+0.228801962,LastTimestamp:2025-12-05 23:19:39.545397656 +0000 UTC m=+0.228801962,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.570478 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.570574 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.570598 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.570617 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.570635 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.570655 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.570675 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.570693 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.570717 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.570734 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.570755 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.570774 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.570792 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.570813 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.570831 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.570851 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.570881 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.570899 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.570917 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.570936 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.570954 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.570972 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.570997 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571018 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571039 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571057 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571078 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571098 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571117 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571169 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571212 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571232 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571252 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571273 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571292 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571311 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571330 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571349 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571368 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571387 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571405 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571425 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571446 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571467 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571485 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571505 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571622 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571646 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571665 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571686 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571705 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571724 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571749 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571804 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571826 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571846 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571866 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571886 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571904 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571922 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571941 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571961 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571980 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.571999 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.572019 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.572036 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.572055 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573181 4734 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573234 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573268 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573288 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573311 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573329 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573348 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573368 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573386 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573403 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573420 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573441 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573459 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573479 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573497 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573517 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573567 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573587 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573605 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573626 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573644 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573662 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573679 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" 
seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573696 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573715 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573732 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573809 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573835 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573854 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 
23:19:39.573872 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573889 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573906 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573925 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.573943 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574002 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574023 4734 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574040 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574060 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574087 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574111 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574131 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574151 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574180 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574210 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574235 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574258 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574316 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574339 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" 
seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574356 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574376 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574396 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574414 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574434 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574452 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574470 4734 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574489 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574507 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574550 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574569 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574587 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574604 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574622 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574641 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574660 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574679 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574697 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574714 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574731 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574750 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574767 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574784 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574802 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574819 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574837 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574854 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574874 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574894 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574914 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574935 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" 
seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574953 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574970 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.574989 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575007 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575026 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575044 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 
23:19:39.575061 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575079 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575097 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575116 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575133 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575151 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575170 4734 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575188 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575205 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575224 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575242 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575260 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575281 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575300 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575319 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575337 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575355 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575377 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575394 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575411 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575429 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575448 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575466 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575488 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575507 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575674 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575701 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575721 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575739 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575756 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575775 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575792 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575809 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575827 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575847 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575868 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575887 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575912 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575937 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575958 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575977 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.575994 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.576015 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.576035 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.576056 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.576074 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.576092 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.576110 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.576127 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.576146 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.576166 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.576185 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.576202 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.576222 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.576241 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.576261 4734 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.576279 4734 reconstruct.go:97] "Volume reconstruction finished" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.576294 4734 reconciler.go:26] "Reconciler: start to sync state" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.585932 4734 manager.go:324] Recovery completed Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.607966 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.609074 4734 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.610396 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.610449 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.610463 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.612641 4734 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.612709 4734 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.612761 4734 kubelet.go:2335] "Starting kubelet main sync loop" Dec 05 23:19:39 crc kubenswrapper[4734]: E1205 23:19:39.612849 4734 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.613352 4734 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.613449 4734 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.613573 4734 state_mem.go:36] "Initialized new in-memory state store" Dec 05 23:19:39 crc kubenswrapper[4734]: W1205 23:19:39.613554 4734 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Dec 05 23:19:39 crc kubenswrapper[4734]: E1205 23:19:39.614016 4734 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.625056 4734 policy_none.go:49] "None policy: Start" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.626335 4734 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.626366 4734 state_mem.go:35] "Initializing new in-memory state store" Dec 05 23:19:39 crc 
kubenswrapper[4734]: E1205 23:19:39.653302 4734 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.673692 4734 manager.go:334] "Starting Device Plugin manager" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.673999 4734 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.674077 4734 server.go:79] "Starting device plugin registration server" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.674810 4734 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.674927 4734 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.675245 4734 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.675404 4734 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.675472 4734 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 05 23:19:39 crc kubenswrapper[4734]: E1205 23:19:39.684165 4734 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.712982 4734 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.713135 4734 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.714625 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.714712 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.714744 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.715126 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.715214 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.715254 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.716230 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.716278 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.716338 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.716387 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.716412 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:39 crc 
kubenswrapper[4734]: I1205 23:19:39.716428 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.716701 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.716794 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.716858 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.718227 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.718281 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.718300 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.718499 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.718674 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.718717 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.718816 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.718862 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.718878 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.719723 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.719767 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.719780 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.719785 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.719816 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.719833 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.720086 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:39 crc kubenswrapper[4734]: 
I1205 23:19:39.720229 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.720274 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.720994 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.721028 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.721044 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.721202 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.721240 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.721256 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.721294 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.721381 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.722236 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.722271 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.722284 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:39 crc kubenswrapper[4734]: E1205 23:19:39.753904 4734 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="400ms" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.775178 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.776767 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.776815 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.776829 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.776862 4734 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 23:19:39 crc kubenswrapper[4734]: E1205 23:19:39.777491 4734 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.38:6443: connect: connection refused" node="crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.777784 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.777883 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.777922 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.777961 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.778018 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.778064 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.778140 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.778212 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.778259 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.778312 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.778374 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.778400 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.778455 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.778476 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.778574 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.879989 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.880207 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.880270 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.880397 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.880406 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.880596 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod 
\"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.880618 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.880786 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.880848 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.881080 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.881137 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:19:39 crc 
kubenswrapper[4734]: I1205 23:19:39.881183 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.881226 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.881390 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.881393 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.881434 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.880644 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.880988 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.881487 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.881270 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.880997 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.881559 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: 
\"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.881606 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.881669 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.881692 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.881713 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.881783 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 
23:19:39.881858 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.881906 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.881968 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.978718 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.981061 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.981136 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.981159 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:39 crc kubenswrapper[4734]: I1205 23:19:39.981203 4734 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 23:19:39 crc kubenswrapper[4734]: E1205 23:19:39.981955 4734 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.38:6443: connect: connection refused" node="crc" Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.042330 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.053005 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.073574 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:19:40 crc kubenswrapper[4734]: W1205 23:19:40.078184 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-71f4f70fc14510566c4fb3f97d0b3c06fcc56e7a92ce2da51408a4d8b301938d WatchSource:0}: Error finding container 71f4f70fc14510566c4fb3f97d0b3c06fcc56e7a92ce2da51408a4d8b301938d: Status 404 returned error can't find the container with id 71f4f70fc14510566c4fb3f97d0b3c06fcc56e7a92ce2da51408a4d8b301938d Dec 05 23:19:40 crc kubenswrapper[4734]: W1205 23:19:40.079485 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-a0df18f60463cb3289fe283fa8c4f2ee0a1b57940cd5c2c71cad6fbcfd8b76c4 WatchSource:0}: Error finding container a0df18f60463cb3289fe283fa8c4f2ee0a1b57940cd5c2c71cad6fbcfd8b76c4: Status 404 returned error can't find the container with id a0df18f60463cb3289fe283fa8c4f2ee0a1b57940cd5c2c71cad6fbcfd8b76c4 Dec 05 23:19:40 crc kubenswrapper[4734]: W1205 23:19:40.088364 4734 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-9fce69fbf0747f3deda345b746ca61680bc9aaf0111e180319bd07b1e68b9f9f WatchSource:0}: Error finding container 9fce69fbf0747f3deda345b746ca61680bc9aaf0111e180319bd07b1e68b9f9f: Status 404 returned error can't find the container with id 9fce69fbf0747f3deda345b746ca61680bc9aaf0111e180319bd07b1e68b9f9f Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.089997 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.096363 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 23:19:40 crc kubenswrapper[4734]: W1205 23:19:40.116368 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-e0e9c48a452626722de4dcfe8e39ea6386d6fcba3871f8886f145138969274bf WatchSource:0}: Error finding container e0e9c48a452626722de4dcfe8e39ea6386d6fcba3871f8886f145138969274bf: Status 404 returned error can't find the container with id e0e9c48a452626722de4dcfe8e39ea6386d6fcba3871f8886f145138969274bf Dec 05 23:19:40 crc kubenswrapper[4734]: W1205 23:19:40.117865 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-e21037cf0cc2d502d61d7c5f7c49e92aeda719d21e2956316ecd18bd6780dbab WatchSource:0}: Error finding container e21037cf0cc2d502d61d7c5f7c49e92aeda719d21e2956316ecd18bd6780dbab: Status 404 returned error can't find the container with id e21037cf0cc2d502d61d7c5f7c49e92aeda719d21e2956316ecd18bd6780dbab Dec 05 23:19:40 crc kubenswrapper[4734]: E1205 23:19:40.155355 4734 controller.go:145] "Failed to ensure lease 
exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="800ms" Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.382658 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.384657 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.384709 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.384720 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.384755 4734 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 23:19:40 crc kubenswrapper[4734]: E1205 23:19:40.385379 4734 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.38:6443: connect: connection refused" node="crc" Dec 05 23:19:40 crc kubenswrapper[4734]: W1205 23:19:40.417378 4734 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Dec 05 23:19:40 crc kubenswrapper[4734]: E1205 23:19:40.417464 4734 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: 
connect: connection refused" logger="UnhandledError" Dec 05 23:19:40 crc kubenswrapper[4734]: W1205 23:19:40.524287 4734 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Dec 05 23:19:40 crc kubenswrapper[4734]: E1205 23:19:40.524971 4734 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError" Dec 05 23:19:40 crc kubenswrapper[4734]: W1205 23:19:40.536019 4734 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Dec 05 23:19:40 crc kubenswrapper[4734]: E1205 23:19:40.536093 4734 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError" Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.547192 4734 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.549334 4734 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline 
is 2025-12-02 08:31:50.237778844 +0000 UTC Dec 05 23:19:40 crc kubenswrapper[4734]: W1205 23:19:40.563620 4734 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Dec 05 23:19:40 crc kubenswrapper[4734]: E1205 23:19:40.563674 4734 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError" Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.625362 4734 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="b67992e82d458b7597aa8360f205e69c467011160b77b9a95a6edac6b329a679" exitCode=0 Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.625488 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"b67992e82d458b7597aa8360f205e69c467011160b77b9a95a6edac6b329a679"} Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.625737 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"71f4f70fc14510566c4fb3f97d0b3c06fcc56e7a92ce2da51408a4d8b301938d"} Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.625925 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.628077 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.628138 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.628159 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.629453 4734 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="ff9c5cbf877fad8c2d4155cab3be27491de84cf4b7f3476f60a02de39936ab51" exitCode=0 Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.629504 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"ff9c5cbf877fad8c2d4155cab3be27491de84cf4b7f3476f60a02de39936ab51"} Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.629585 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e21037cf0cc2d502d61d7c5f7c49e92aeda719d21e2956316ecd18bd6780dbab"} Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.629780 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.631150 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.631189 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.631201 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:40 crc kubenswrapper[4734]: 
I1205 23:19:40.634183 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7"} Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.634231 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e0e9c48a452626722de4dcfe8e39ea6386d6fcba3871f8886f145138969274bf"} Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.636241 4734 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a" exitCode=0 Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.636311 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a"} Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.636341 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9fce69fbf0747f3deda345b746ca61680bc9aaf0111e180319bd07b1e68b9f9f"} Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.636459 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.637699 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.637732 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.637742 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.638187 4734 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="38c9b642a69ad35815bad41f9e51de53d3cd09b49769b0d1f5a329c0a6b9c879" exitCode=0 Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.638249 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"38c9b642a69ad35815bad41f9e51de53d3cd09b49769b0d1f5a329c0a6b9c879"} Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.638308 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a0df18f60463cb3289fe283fa8c4f2ee0a1b57940cd5c2c71cad6fbcfd8b76c4"} Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.638507 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.639900 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.640931 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.640974 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.640988 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.641325 4734 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.641434 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:40 crc kubenswrapper[4734]: I1205 23:19:40.641514 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:40 crc kubenswrapper[4734]: E1205 23:19:40.958581 4734 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="1.6s" Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.186540 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.188372 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.188429 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.188441 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.188505 4734 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 23:19:41 crc kubenswrapper[4734]: E1205 23:19:41.189199 4734 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.38:6443: connect: connection refused" node="crc" Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.549408 4734 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, 
rotation deadline is 2025-11-13 00:28:39.511872804 +0000 UTC Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.645482 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"d3839b99230a52ff4e7236bc43a6e597e798130b3db74b5e7ba955e2bdedb700"} Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.645690 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.646870 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.646906 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.646918 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.648661 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6be62799b986e89e6324a37ffed14cfc15d4fa6efec043e842534075da2b7547"} Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.648692 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b9676ebc0e731c50baebbab917a9dc814ceea006a370980021eaeb8bf822825b"} Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.648706 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"32d72232eb5162100a1a381e51548864fc732ff00fd26239351ec294328fc7fe"} Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.648780 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.649637 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.649690 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.649708 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.651710 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3"} Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.651747 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a"} Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.651760 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5"} Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.651768 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 
05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.653054 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.653100 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.653118 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.658412 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6"} Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.658448 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818"} Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.658462 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30"} Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.658478 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0"} Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.672824 4734 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" 
containerID="90893df2b4671caaddc27de750cf83f1078dcf02b17d97c11fdb5731aa9e36fd" exitCode=0 Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.672884 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"90893df2b4671caaddc27de750cf83f1078dcf02b17d97c11fdb5731aa9e36fd"} Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.673054 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.675909 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.676199 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:41 crc kubenswrapper[4734]: I1205 23:19:41.676256 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:42 crc kubenswrapper[4734]: I1205 23:19:42.550125 4734 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 12:39:17.740593641 +0000 UTC Dec 05 23:19:42 crc kubenswrapper[4734]: I1205 23:19:42.550197 4734 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 349h19m35.190402242s for next certificate rotation Dec 05 23:19:42 crc kubenswrapper[4734]: I1205 23:19:42.682499 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097"} Dec 05 23:19:42 crc kubenswrapper[4734]: I1205 23:19:42.682914 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:42 crc 
kubenswrapper[4734]: I1205 23:19:42.684413 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:42 crc kubenswrapper[4734]: I1205 23:19:42.684469 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:42 crc kubenswrapper[4734]: I1205 23:19:42.684487 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:42 crc kubenswrapper[4734]: I1205 23:19:42.691066 4734 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="792abd5f07f5d3a7fe44e968262e800ae1db038efa574fa67edf28897053b33c" exitCode=0 Dec 05 23:19:42 crc kubenswrapper[4734]: I1205 23:19:42.691190 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"792abd5f07f5d3a7fe44e968262e800ae1db038efa574fa67edf28897053b33c"} Dec 05 23:19:42 crc kubenswrapper[4734]: I1205 23:19:42.691237 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:42 crc kubenswrapper[4734]: I1205 23:19:42.691480 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:42 crc kubenswrapper[4734]: I1205 23:19:42.692273 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:42 crc kubenswrapper[4734]: I1205 23:19:42.692307 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:42 crc kubenswrapper[4734]: I1205 23:19:42.692318 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:42 crc kubenswrapper[4734]: I1205 23:19:42.692898 4734 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:42 crc kubenswrapper[4734]: I1205 23:19:42.692964 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:42 crc kubenswrapper[4734]: I1205 23:19:42.692992 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:42 crc kubenswrapper[4734]: I1205 23:19:42.789587 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:42 crc kubenswrapper[4734]: I1205 23:19:42.791578 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:42 crc kubenswrapper[4734]: I1205 23:19:42.791794 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:42 crc kubenswrapper[4734]: I1205 23:19:42.791963 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:42 crc kubenswrapper[4734]: I1205 23:19:42.792107 4734 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 23:19:43 crc kubenswrapper[4734]: I1205 23:19:43.699640 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4d2de38ae1a22f7b60ca15ad68c7022cf8e1b0f4c38168ff4670f118916836a5"} Dec 05 23:19:43 crc kubenswrapper[4734]: I1205 23:19:43.699694 4734 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 23:19:43 crc kubenswrapper[4734]: I1205 23:19:43.699703 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"18e9232e96fadda1a12372d99c803d6bc96d8983037fffaef522168c4712cdee"} Dec 05 23:19:43 crc kubenswrapper[4734]: 
I1205 23:19:43.699723 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"eb871beafb02ab4cde9373f4e3d0a67b2d882aba3ce282bfebddbeb2ad040dbc"} Dec 05 23:19:43 crc kubenswrapper[4734]: I1205 23:19:43.699740 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:43 crc kubenswrapper[4734]: I1205 23:19:43.699740 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"71975d921923c8b1019461883c195fc8cc46b0893191c52a131902c18addfe06"} Dec 05 23:19:43 crc kubenswrapper[4734]: I1205 23:19:43.700762 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:43 crc kubenswrapper[4734]: I1205 23:19:43.700816 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:43 crc kubenswrapper[4734]: I1205 23:19:43.700831 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:44 crc kubenswrapper[4734]: I1205 23:19:44.707632 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9bf8f2cf70de4e9aaddf5433a7d21ed2d41be608847880aca207f103e0be5570"} Dec 05 23:19:44 crc kubenswrapper[4734]: I1205 23:19:44.707819 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:44 crc kubenswrapper[4734]: I1205 23:19:44.708866 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:44 crc kubenswrapper[4734]: I1205 23:19:44.708902 4734 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:44 crc kubenswrapper[4734]: I1205 23:19:44.708917 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:44 crc kubenswrapper[4734]: I1205 23:19:44.986075 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 05 23:19:45 crc kubenswrapper[4734]: I1205 23:19:45.710796 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:45 crc kubenswrapper[4734]: I1205 23:19:45.712418 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:45 crc kubenswrapper[4734]: I1205 23:19:45.712485 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:45 crc kubenswrapper[4734]: I1205 23:19:45.712517 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:45 crc kubenswrapper[4734]: I1205 23:19:45.742027 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:19:45 crc kubenswrapper[4734]: I1205 23:19:45.742241 4734 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 23:19:45 crc kubenswrapper[4734]: I1205 23:19:45.742296 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:45 crc kubenswrapper[4734]: I1205 23:19:45.743965 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:45 crc kubenswrapper[4734]: I1205 23:19:45.744019 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:45 crc kubenswrapper[4734]: I1205 23:19:45.744030 4734 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:46 crc kubenswrapper[4734]: I1205 23:19:46.145586 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:19:46 crc kubenswrapper[4734]: I1205 23:19:46.715249 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:46 crc kubenswrapper[4734]: I1205 23:19:46.715249 4734 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 23:19:46 crc kubenswrapper[4734]: I1205 23:19:46.715503 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:46 crc kubenswrapper[4734]: I1205 23:19:46.717143 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:46 crc kubenswrapper[4734]: I1205 23:19:46.717216 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:46 crc kubenswrapper[4734]: I1205 23:19:46.717235 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:46 crc kubenswrapper[4734]: I1205 23:19:46.717717 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:46 crc kubenswrapper[4734]: I1205 23:19:46.717786 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:46 crc kubenswrapper[4734]: I1205 23:19:46.717812 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:46 crc kubenswrapper[4734]: I1205 23:19:46.959384 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 23:19:46 crc 
kubenswrapper[4734]: I1205 23:19:46.959776 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:46 crc kubenswrapper[4734]: I1205 23:19:46.962250 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:46 crc kubenswrapper[4734]: I1205 23:19:46.962319 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:46 crc kubenswrapper[4734]: I1205 23:19:46.962349 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:46 crc kubenswrapper[4734]: I1205 23:19:46.967893 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 23:19:47 crc kubenswrapper[4734]: I1205 23:19:47.059044 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:19:47 crc kubenswrapper[4734]: I1205 23:19:47.247151 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 23:19:47 crc kubenswrapper[4734]: I1205 23:19:47.247392 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:47 crc kubenswrapper[4734]: I1205 23:19:47.249114 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:47 crc kubenswrapper[4734]: I1205 23:19:47.249158 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:47 crc kubenswrapper[4734]: I1205 23:19:47.249191 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:47 crc kubenswrapper[4734]: I1205 23:19:47.554017 4734 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 23:19:47 crc kubenswrapper[4734]: I1205 23:19:47.718501 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:47 crc kubenswrapper[4734]: I1205 23:19:47.718661 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:47 crc kubenswrapper[4734]: I1205 23:19:47.719963 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:47 crc kubenswrapper[4734]: I1205 23:19:47.720020 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:47 crc kubenswrapper[4734]: I1205 23:19:47.720037 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:47 crc kubenswrapper[4734]: I1205 23:19:47.720826 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:47 crc kubenswrapper[4734]: I1205 23:19:47.720899 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:47 crc kubenswrapper[4734]: I1205 23:19:47.720927 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:47 crc kubenswrapper[4734]: I1205 23:19:47.957840 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 05 23:19:47 crc kubenswrapper[4734]: I1205 23:19:47.958119 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:47 crc kubenswrapper[4734]: I1205 23:19:47.959776 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 05 23:19:47 crc kubenswrapper[4734]: I1205 23:19:47.959829 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:47 crc kubenswrapper[4734]: I1205 23:19:47.959851 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:48 crc kubenswrapper[4734]: I1205 23:19:48.160651 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 23:19:48 crc kubenswrapper[4734]: I1205 23:19:48.721457 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:48 crc kubenswrapper[4734]: I1205 23:19:48.722862 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:48 crc kubenswrapper[4734]: I1205 23:19:48.722914 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:48 crc kubenswrapper[4734]: I1205 23:19:48.722933 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:49 crc kubenswrapper[4734]: E1205 23:19:49.685061 4734 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 05 23:19:49 crc kubenswrapper[4734]: I1205 23:19:49.723602 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:49 crc kubenswrapper[4734]: I1205 23:19:49.725015 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:49 crc kubenswrapper[4734]: I1205 23:19:49.725099 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:49 crc 
kubenswrapper[4734]: I1205 23:19:49.725125 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:50 crc kubenswrapper[4734]: I1205 23:19:50.176685 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 23:19:50 crc kubenswrapper[4734]: I1205 23:19:50.554154 4734 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 23:19:50 crc kubenswrapper[4734]: I1205 23:19:50.554253 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 05 23:19:50 crc kubenswrapper[4734]: I1205 23:19:50.726340 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:50 crc kubenswrapper[4734]: I1205 23:19:50.727761 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:50 crc kubenswrapper[4734]: I1205 23:19:50.727839 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:50 crc kubenswrapper[4734]: I1205 23:19:50.727853 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:50 crc kubenswrapper[4734]: I1205 23:19:50.733866 4734 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 23:19:51 crc kubenswrapper[4734]: I1205 23:19:51.547359 4734 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 05 23:19:51 crc kubenswrapper[4734]: I1205 23:19:51.729093 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:51 crc kubenswrapper[4734]: I1205 23:19:51.730265 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:51 crc kubenswrapper[4734]: I1205 23:19:51.730318 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:51 crc kubenswrapper[4734]: I1205 23:19:51.730335 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:52 crc kubenswrapper[4734]: W1205 23:19:52.486208 4734 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 05 23:19:52 crc kubenswrapper[4734]: I1205 23:19:52.486339 4734 trace.go:236] Trace[1587382210]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 23:19:42.484) (total time: 10001ms): Dec 05 23:19:52 crc kubenswrapper[4734]: Trace[1587382210]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (23:19:52.486) Dec 05 23:19:52 crc kubenswrapper[4734]: Trace[1587382210]: [10.001908398s] [10.001908398s] END Dec 05 23:19:52 crc 
kubenswrapper[4734]: E1205 23:19:52.486373 4734 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 05 23:19:52 crc kubenswrapper[4734]: E1205 23:19:52.560666 4734 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 05 23:19:52 crc kubenswrapper[4734]: W1205 23:19:52.611575 4734 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 05 23:19:52 crc kubenswrapper[4734]: I1205 23:19:52.611712 4734 trace.go:236] Trace[421522941]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 23:19:42.609) (total time: 10001ms): Dec 05 23:19:52 crc kubenswrapper[4734]: Trace[421522941]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (23:19:52.611) Dec 05 23:19:52 crc kubenswrapper[4734]: Trace[421522941]: [10.001926499s] [10.001926499s] END Dec 05 23:19:52 crc kubenswrapper[4734]: E1205 23:19:52.611752 4734 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" 
Dec 05 23:19:52 crc kubenswrapper[4734]: E1205 23:19:52.793460 4734 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Dec 05 23:19:53 crc kubenswrapper[4734]: I1205 23:19:53.156971 4734 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 05 23:19:53 crc kubenswrapper[4734]: I1205 23:19:53.157047 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 05 23:19:53 crc kubenswrapper[4734]: I1205 23:19:53.166242 4734 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 05 23:19:53 crc kubenswrapper[4734]: I1205 23:19:53.166304 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 05 23:19:55 crc kubenswrapper[4734]: I1205 23:19:55.022111 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 05 23:19:55 crc 
kubenswrapper[4734]: I1205 23:19:55.022392 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:55 crc kubenswrapper[4734]: I1205 23:19:55.024202 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:55 crc kubenswrapper[4734]: I1205 23:19:55.024277 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:55 crc kubenswrapper[4734]: I1205 23:19:55.024296 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:55 crc kubenswrapper[4734]: I1205 23:19:55.040008 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 05 23:19:55 crc kubenswrapper[4734]: I1205 23:19:55.742178 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:55 crc kubenswrapper[4734]: I1205 23:19:55.743930 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:55 crc kubenswrapper[4734]: I1205 23:19:55.744036 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:55 crc kubenswrapper[4734]: I1205 23:19:55.744056 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:55 crc kubenswrapper[4734]: I1205 23:19:55.752777 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:19:55 crc kubenswrapper[4734]: I1205 23:19:55.753069 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:55 crc kubenswrapper[4734]: I1205 23:19:55.754996 4734 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:55 crc kubenswrapper[4734]: I1205 23:19:55.755103 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:55 crc kubenswrapper[4734]: I1205 23:19:55.755134 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:55 crc kubenswrapper[4734]: I1205 23:19:55.761164 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:19:55 crc kubenswrapper[4734]: I1205 23:19:55.994242 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:19:55 crc kubenswrapper[4734]: I1205 23:19:55.997254 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:19:55 crc kubenswrapper[4734]: I1205 23:19:55.997324 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:19:55 crc kubenswrapper[4734]: I1205 23:19:55.997349 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:19:55 crc kubenswrapper[4734]: I1205 23:19:55.997400 4734 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 23:19:56 crc kubenswrapper[4734]: E1205 23:19:56.002084 4734 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 05 23:19:56 crc kubenswrapper[4734]: I1205 23:19:56.208425 4734 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 05 23:19:56 crc kubenswrapper[4734]: I1205 23:19:56.548877 4734 apiserver.go:52] "Watching apiserver" Dec 05 23:19:56 crc kubenswrapper[4734]: I1205 23:19:56.553925 4734 reflector.go:368] 
Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 05 23:19:56 crc kubenswrapper[4734]: I1205 23:19:56.554301 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Dec 05 23:19:56 crc kubenswrapper[4734]: I1205 23:19:56.554773 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 23:19:56 crc kubenswrapper[4734]: I1205 23:19:56.554907 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:19:56 crc kubenswrapper[4734]: E1205 23:19:56.555221 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:19:56 crc kubenswrapper[4734]: I1205 23:19:56.555495 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:19:56 crc kubenswrapper[4734]: I1205 23:19:56.555632 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 23:19:56 crc kubenswrapper[4734]: I1205 23:19:56.555721 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:19:56 crc kubenswrapper[4734]: I1205 23:19:56.555763 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 23:19:56 crc kubenswrapper[4734]: E1205 23:19:56.555784 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:19:56 crc kubenswrapper[4734]: E1205 23:19:56.555784 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:19:56 crc kubenswrapper[4734]: I1205 23:19:56.558843 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 05 23:19:56 crc kubenswrapper[4734]: I1205 23:19:56.559948 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 05 23:19:56 crc kubenswrapper[4734]: I1205 23:19:56.559956 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 05 23:19:56 crc kubenswrapper[4734]: I1205 23:19:56.561339 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 05 23:19:56 crc kubenswrapper[4734]: I1205 23:19:56.561345 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 05 23:19:56 crc kubenswrapper[4734]: I1205 23:19:56.561400 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 05 23:19:56 crc kubenswrapper[4734]: I1205 23:19:56.561402 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 05 23:19:56 crc kubenswrapper[4734]: I1205 23:19:56.561401 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 05 23:19:56 crc kubenswrapper[4734]: I1205 23:19:56.561464 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 05 23:19:56 crc kubenswrapper[4734]: I1205 23:19:56.609342 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 23:19:56 crc kubenswrapper[4734]: I1205 23:19:56.636683 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 23:19:56 crc kubenswrapper[4734]: I1205 23:19:56.650405 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 23:19:56 crc kubenswrapper[4734]: I1205 23:19:56.653468 4734 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 05 23:19:56 crc kubenswrapper[4734]: I1205 23:19:56.670204 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 23:19:56 crc kubenswrapper[4734]: I1205 23:19:56.683421 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 23:19:56 crc kubenswrapper[4734]: I1205 23:19:56.695178 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 23:19:56 crc kubenswrapper[4734]: I1205 23:19:56.706771 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 23:19:56 crc kubenswrapper[4734]: I1205 23:19:56.767104 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 23:19:57 crc kubenswrapper[4734]: I1205 23:19:57.106430 4734 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 05 23:19:57 crc kubenswrapper[4734]: I1205 23:19:57.748103 4734 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.156100 4734 trace.go:236] Trace[1306609689]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 
23:19:43.538) (total time: 14617ms): Dec 05 23:19:58 crc kubenswrapper[4734]: Trace[1306609689]: ---"Objects listed" error: 14617ms (23:19:58.155) Dec 05 23:19:58 crc kubenswrapper[4734]: Trace[1306609689]: [14.617095977s] [14.617095977s] END Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.156150 4734 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.157579 4734 trace.go:236] Trace[316044594]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 23:19:43.640) (total time: 14516ms): Dec 05 23:19:58 crc kubenswrapper[4734]: Trace[316044594]: ---"Objects listed" error: 14516ms (23:19:58.157) Dec 05 23:19:58 crc kubenswrapper[4734]: Trace[316044594]: [14.516557386s] [14.516557386s] END Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.157617 4734 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.162591 4734 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.224127 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.241445 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:4
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.253809 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.255870 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.260656 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.263406 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.263450 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.263484 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.263504 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.263558 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.263580 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.263605 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.263623 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.263646 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.263663 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.263685 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.263718 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.263735 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.263752 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.263772 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.263791 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.263812 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.263845 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.263875 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.263891 4734 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.263909 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.263926 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.263947 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.263963 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.263985 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264004 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264020 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264035 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264052 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264089 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 05 
23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264107 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264127 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264146 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264162 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264181 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264196 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264214 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264232 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264247 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264263 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264284 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 23:19:58 crc 
kubenswrapper[4734]: I1205 23:19:58.264314 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264331 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264352 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264368 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264386 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264403 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: 
\"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264420 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264440 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264456 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264472 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264487 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 
23:19:58.264504 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264545 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264564 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264580 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264598 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264614 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264632 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264651 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264074 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264593 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.266545 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.266513 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.266550 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.267924 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.267973 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.268264 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.268434 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.268481 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.268656 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.268708 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.268973 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.269013 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.269262 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.269405 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.269458 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.269840 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.270453 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.271297 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.271609 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.264670 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274134 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274182 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274208 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274229 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274245 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274264 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274282 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274300 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274328 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274347 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 
23:19:58.274368 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274388 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274406 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274427 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274447 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274466 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274451 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274483 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274506 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274440 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274546 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274623 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274683 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274710 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274732 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274754 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274776 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274808 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274832 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274856 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274880 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod 
\"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274917 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274937 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274958 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.274978 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275002 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275020 4734 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275053 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275074 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275072 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275093 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275114 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275134 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275150 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275173 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275192 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275213 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275236 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275262 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275284 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275304 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 
05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275328 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275352 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275372 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275395 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275417 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275441 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275461 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275481 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275556 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275579 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275598 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 
23:19:58.275619 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275638 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275661 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.271351 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275679 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277242 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277268 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277290 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277313 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277337 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277360 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: 
\"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277380 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277402 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277422 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277447 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277465 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277484 4734 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277507 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277540 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277567 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277588 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277609 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277630 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277648 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277667 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277685 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277706 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277727 4734 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277744 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277765 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277787 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277808 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277827 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277847 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277866 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277885 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277905 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277923 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 23:19:58 crc 
kubenswrapper[4734]: I1205 23:19:58.277940 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277963 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.278006 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.278037 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.278060 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.278080 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.278097 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.278118 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.278139 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.278158 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.278175 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.278194 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.278213 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.278234 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.278256 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.278276 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.278294 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: 
\"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.278317 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.278350 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.278370 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.278389 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.278407 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.278426 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.278502 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.278547 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.278567 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.278587 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.278608 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: 
\"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.278626 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275300 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275207 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275384 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275597 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275722 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275844 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.278938 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.279199 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.280112 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.280417 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.276047 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.276203 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.276241 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.276418 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.276426 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.276421 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277103 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277175 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277197 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277226 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277301 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277463 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277460 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277561 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.277646 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.278098 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.278121 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.278420 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.280634 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.280725 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.280807 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.281159 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.281148 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.281169 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.281197 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.281423 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.281755 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.281808 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.281836 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.281960 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.282382 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.282401 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.282467 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.282485 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.282756 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.275848 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.282771 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.282878 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.283095 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.283235 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.283410 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.283314 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.283825 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.283976 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.284181 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.284243 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.284310 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.284467 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.284477 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.284677 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.284699 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.284952 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.285063 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.285084 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.285304 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.285360 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.285701 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.285756 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.285813 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.285915 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.286071 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.286881 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.287013 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.287057 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.287141 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.287642 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.287544 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.287894 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.287924 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.288055 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.288106 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.288318 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.288409 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.288428 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.288441 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.288507 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.288511 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.288738 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.288991 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.289008 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.289051 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.289176 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.289343 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.289568 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.289389 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.289590 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.289390 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.289733 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.289803 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.289814 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.290078 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.290082 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.290438 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.290830 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.290954 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.291232 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.291774 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.291887 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.291922 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.291926 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.291943 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.292012 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.278645 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.292145 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.292224 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.292288 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.292300 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.292318 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.292327 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.292336 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.292343 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.292355 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.292380 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.292384 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.292447 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.292470 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.292492 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.292514 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.292594 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" 
(UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.292624 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.292648 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.292693 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.292715 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.292740 4734 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.292761 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.292768 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.292769 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.292772 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.292803 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.292910 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.293126 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.293132 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.293169 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.293244 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.293248 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.293280 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.293492 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.293518 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.293608 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.293811 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.292787 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.293942 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.293889 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294005 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294057 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294066 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294084 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294097 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294126 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). 
InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294126 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 23:19:58 crc kubenswrapper[4734]: E1205 23:19:58.294158 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:19:58.794138102 +0000 UTC m=+19.477542378 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294195 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294473 4734 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294489 4734 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294501 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294501 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294513 4734 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294542 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294553 4734 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294562 4734 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294571 4734 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294585 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294597 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc 
kubenswrapper[4734]: I1205 23:19:58.294607 4734 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294620 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294630 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294639 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294649 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294660 4734 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294671 4734 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294680 4734 reconciler_common.go:293] "Volume 
detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294689 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294736 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294746 4734 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294756 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294765 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294775 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294785 4734 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294796 4734 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294805 4734 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294815 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294824 4734 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294835 4734 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294846 4734 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294871 4734 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 
23:19:58.294883 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294895 4734 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294907 4734 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294919 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294930 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294939 4734 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294949 4734 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294958 4734 reconciler_common.go:293] "Volume detached 
for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294967 4734 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294977 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294987 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294567 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294996 4734 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295072 4734 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 
23:19:58.295076 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295088 4734 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294511 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295110 4734 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295122 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295134 4734 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295148 4734 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295163 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295177 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295191 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc 
kubenswrapper[4734]: I1205 23:19:58.295206 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295219 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295235 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295249 4734 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295274 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295287 4734 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295300 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295313 4734 
reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295326 4734 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295338 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295350 4734 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295361 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295372 4734 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295372 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 
23:19:58.295381 4734 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295390 4734 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295402 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295413 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295423 4734 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295432 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295441 4734 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295451 4734 reconciler_common.go:293] "Volume detached for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295465 4734 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295474 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295486 4734 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295495 4734 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295505 4734 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295515 4734 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295547 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" 
DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295563 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295577 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295844 4734 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295856 4734 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295867 4734 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295876 4734 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295886 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 
23:19:58.295898 4734 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295908 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295918 4734 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295927 4734 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295937 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295947 4734 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295957 4734 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295967 4734 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295977 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295986 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295996 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296009 4734 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296021 4734 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296033 4734 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296052 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: 
\"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296062 4734 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296074 4734 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296084 4734 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296094 4734 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296106 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296117 4734 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296126 4734 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296136 4734 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296146 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296155 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296168 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296786 4734 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296803 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296813 4734 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc 
kubenswrapper[4734]: I1205 23:19:58.296825 4734 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296835 4734 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296846 4734 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296857 4734 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296869 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296880 4734 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296892 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296906 4734 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296921 4734 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296931 4734 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296941 4734 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296955 4734 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296966 4734 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296977 4734 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296988 4734 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.297000 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.297011 4734 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.297023 4734 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.297035 4734 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.297047 4734 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.297060 4734 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.297073 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: 
\"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.297084 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.297095 4734 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.297105 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.297116 4734 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.297128 4734 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.297138 4734 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.297147 4734 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node 
\"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.297157 4734 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.297166 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.297177 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.297187 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.297197 4734 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.297206 4734 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.297217 4734 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.297226 4734 
reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.297236 4734 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.297245 4734 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.297255 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.297265 4734 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.297273 4734 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.297283 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.297292 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295214 4734 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295357 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: E1205 23:19:58.295440 4734 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 23:19:58 crc kubenswrapper[4734]: E1205 23:19:58.303031 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 23:19:58.802993711 +0000 UTC m=+19.486397987 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 23:19:58 crc kubenswrapper[4734]: E1205 23:19:58.294773 4734 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 23:19:58 crc kubenswrapper[4734]: E1205 23:19:58.303095 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 23:19:58.803087703 +0000 UTC m=+19.486491979 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294979 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295624 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.295817 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.294733 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296125 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296134 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296159 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.296398 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.299899 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.301673 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.301958 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.302276 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.302792 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.302826 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.302840 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.306847 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.307275 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.307573 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.309646 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.311265 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.311300 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.316020 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 23:19:58 crc kubenswrapper[4734]: E1205 23:19:58.316518 4734 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 23:19:58 crc kubenswrapper[4734]: E1205 23:19:58.316568 4734 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 23:19:58 crc kubenswrapper[4734]: E1205 23:19:58.316585 4734 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 23:19:58 crc kubenswrapper[4734]: E1205 23:19:58.316666 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-12-05 23:19:58.816642105 +0000 UTC m=+19.500046381 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.316832 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.322405 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.323119 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.324751 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 05 23:19:58 crc kubenswrapper[4734]: E1205 23:19:58.325724 4734 projected.go:288] 
Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 23:19:58 crc kubenswrapper[4734]: E1205 23:19:58.325810 4734 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 23:19:58 crc kubenswrapper[4734]: E1205 23:19:58.325882 4734 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 23:19:58 crc kubenswrapper[4734]: E1205 23:19:58.325993 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 23:19:58.825969696 +0000 UTC m=+19.509373972 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.326185 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.326723 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.327311 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.329316 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.330715 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.331372 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.331956 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.332043 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.342464 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.347357 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.352295 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.359442 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.363000 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.366487 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.373766 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc4
78274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
2-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.384189 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.398414 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.398691 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.398745 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.398826 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.398849 4734 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.398863 4734 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.398911 4734 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc 
kubenswrapper[4734]: I1205 23:19:58.398929 4734 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.398941 4734 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.398954 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.398896 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.399016 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.398998 4734 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.399090 4734 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.399103 4734 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.399115 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.399127 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.399136 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.399146 4734 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.399156 4734 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.399167 4734 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.399178 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.399189 4734 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.399201 4734 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.399212 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.399222 4734 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.399233 4734 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.399244 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.399254 4734 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.399263 4734 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.399272 4734 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.399281 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.399291 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.399299 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.399309 4734 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc 
kubenswrapper[4734]: I1205 23:19:58.399318 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.399327 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.399337 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.453471 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.482690 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.494790 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.504933 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.520080 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.613278 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.613309 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.613317 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:19:58 crc kubenswrapper[4734]: E1205 23:19:58.613416 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:19:58 crc kubenswrapper[4734]: E1205 23:19:58.613478 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:19:58 crc kubenswrapper[4734]: E1205 23:19:58.613665 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.681864 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.691822 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 23:19:58 crc kubenswrapper[4734]: W1205 23:19:58.751858 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-6a9312f379a1398721f41af4e63b4ce28a465a4ebe257be69a6c0f07589d1cc6 WatchSource:0}: Error finding container 6a9312f379a1398721f41af4e63b4ce28a465a4ebe257be69a6c0f07589d1cc6: Status 404 returned error can't find the container with id 6a9312f379a1398721f41af4e63b4ce28a465a4ebe257be69a6c0f07589d1cc6 Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.754986 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a85b6b105030d13772d4c780316d548ac8b4590b091eca7d70cfde1b16bc5c4a"} Dec 05 23:19:58 crc kubenswrapper[4734]: W1205 23:19:58.755397 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-e784c5666c30414d7b774d9d2260403e35cff200d7b2f6321316628aedd09054 WatchSource:0}: Error finding container e784c5666c30414d7b774d9d2260403e35cff200d7b2f6321316628aedd09054: Status 404 returned error can't find the container with id e784c5666c30414d7b774d9d2260403e35cff200d7b2f6321316628aedd09054 Dec 05 23:19:58 crc kubenswrapper[4734]: E1205 23:19:58.768635 4734 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.803128 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.803261 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.803327 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:19:58 crc kubenswrapper[4734]: E1205 23:19:58.803455 4734 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 23:19:58 crc kubenswrapper[4734]: E1205 23:19:58.803498 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:19:59.803460287 +0000 UTC m=+20.486864563 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:19:58 crc kubenswrapper[4734]: E1205 23:19:58.803568 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 23:19:59.803544629 +0000 UTC m=+20.486948905 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 23:19:58 crc kubenswrapper[4734]: E1205 23:19:58.803586 4734 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 23:19:58 crc kubenswrapper[4734]: E1205 23:19:58.803723 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 23:19:59.803695273 +0000 UTC m=+20.487099549 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.904227 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:19:58 crc kubenswrapper[4734]: I1205 23:19:58.904286 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:19:58 crc kubenswrapper[4734]: E1205 23:19:58.904471 4734 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 23:19:58 crc kubenswrapper[4734]: E1205 23:19:58.904492 4734 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 23:19:58 crc kubenswrapper[4734]: E1205 23:19:58.904505 4734 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 23:19:58 crc kubenswrapper[4734]: E1205 23:19:58.904585 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 23:19:59.904565422 +0000 UTC m=+20.587969698 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 23:19:58 crc kubenswrapper[4734]: E1205 23:19:58.904648 4734 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 23:19:58 crc kubenswrapper[4734]: E1205 23:19:58.904660 4734 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 23:19:58 crc kubenswrapper[4734]: E1205 23:19:58.904669 4734 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 23:19:58 crc kubenswrapper[4734]: E1205 23:19:58.904693 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-12-05 23:19:59.904686155 +0000 UTC m=+20.588090431 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.617053 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.617961 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.619149 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.620496 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.621332 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.622589 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.623382 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.624160 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.625855 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.626736 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.627368 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.628245 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.628974 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.629671 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.630345 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.630696 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:19:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.631046 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.631824 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.632335 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.633113 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.633914 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.634549 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.635264 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.636929 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.637643 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.638131 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.638744 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.639378 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.639895 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.640483 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.640963 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.641425 4734 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.641590 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.642946 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.643453 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.643920 4734 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.645043 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.648850 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:19:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.648906 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.650031 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.651030 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.652152 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 
23:19:59.652939 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.653917 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.654969 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.657161 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.657764 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.658360 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.658981 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.660097 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 
23:19:59.660676 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.661219 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.661748 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.662417 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.663129 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.663685 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.664414 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:19:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.680408 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:19:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.698444 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:19:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.714637 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:19:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.736375 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:4
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:19:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.753919 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:19:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.760201 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade"} Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.760311 4734 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"e784c5666c30414d7b774d9d2260403e35cff200d7b2f6321316628aedd09054"} Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.762480 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0"} Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.762566 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1793e8462b8065541883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18"} Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.763465 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"6a9312f379a1398721f41af4e63b4ce28a465a4ebe257be69a6c0f07589d1cc6"} Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.778124 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:19:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.793430 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:19:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.810632 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:19:59 crc kubenswrapper[4734]: E1205 23:19:59.810763 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:20:01.810737219 +0000 UTC m=+22.494141495 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.810857 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.810906 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:19:59 crc kubenswrapper[4734]: E1205 23:19:59.811108 4734 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 23:19:59 crc kubenswrapper[4734]: E1205 23:19:59.811743 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 23:20:01.81116936 +0000 UTC m=+22.494573656 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 23:19:59 crc kubenswrapper[4734]: E1205 23:19:59.813205 4734 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 23:19:59 crc kubenswrapper[4734]: E1205 23:19:59.813280 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 23:20:01.813263319 +0000 UTC m=+22.496667615 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.817477 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:19:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.834371 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:19:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.849709 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-05T23:19:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.864590 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:19:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.886675 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:4
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:19:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.902055 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:19:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.912142 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 
23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.912188 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:19:59 crc kubenswrapper[4734]: E1205 23:19:59.912323 4734 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 23:19:59 crc kubenswrapper[4734]: E1205 23:19:59.912339 4734 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 23:19:59 crc kubenswrapper[4734]: E1205 23:19:59.912351 4734 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 23:19:59 crc kubenswrapper[4734]: E1205 23:19:59.912379 4734 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 23:19:59 crc kubenswrapper[4734]: E1205 23:19:59.912427 4734 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 23:19:59 crc kubenswrapper[4734]: E1205 23:19:59.912446 4734 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 23:19:59 crc kubenswrapper[4734]: E1205 23:19:59.912411 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 23:20:01.912396088 +0000 UTC m=+22.595800364 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 23:19:59 crc kubenswrapper[4734]: E1205 23:19:59.912519 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 23:20:01.91250971 +0000 UTC m=+22.595913986 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.914807 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\
"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:19:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.929969 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:19:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.942872 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:19:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.957755 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b8065541883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:19:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.979886 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-05T23:19:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:19:59 crc kubenswrapper[4734]: I1205 23:19:59.999215 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:19:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.064737 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:00Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.095072 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:00Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.119115 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-9l87s"] Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.119558 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-9l87s" Dec 05 23:20:00 crc kubenswrapper[4734]: W1205 23:20:00.125887 4734 reflector.go:561] object-"openshift-image-registry"/"image-registry-certificates": failed to list *v1.ConfigMap: configmaps "image-registry-certificates" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-image-registry": no relationship found between node 'crc' and this object Dec 05 23:20:00 crc kubenswrapper[4734]: W1205 23:20:00.125928 4734 reflector.go:561] object-"openshift-image-registry"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-image-registry": no relationship found between node 'crc' and this object Dec 05 23:20:00 crc kubenswrapper[4734]: E1205 23:20:00.125938 4734 reflector.go:158] "Unhandled Error" err="object-\"openshift-image-registry\"/\"image-registry-certificates\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"image-registry-certificates\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 05 23:20:00 crc kubenswrapper[4734]: E1205 23:20:00.125993 4734 reflector.go:158] "Unhandled Error" err="object-\"openshift-image-registry\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 05 23:20:00 crc kubenswrapper[4734]: W1205 23:20:00.125928 4734 reflector.go:561] 
object-"openshift-image-registry"/"node-ca-dockercfg-4777p": failed to list *v1.Secret: secrets "node-ca-dockercfg-4777p" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-image-registry": no relationship found between node 'crc' and this object Dec 05 23:20:00 crc kubenswrapper[4734]: E1205 23:20:00.126022 4734 reflector.go:158] "Unhandled Error" err="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4777p\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"node-ca-dockercfg-4777p\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 05 23:20:00 crc kubenswrapper[4734]: W1205 23:20:00.126798 4734 reflector.go:561] object-"openshift-image-registry"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-image-registry": no relationship found between node 'crc' and this object Dec 05 23:20:00 crc kubenswrapper[4734]: E1205 23:20:00.126824 4734 reflector.go:158] "Unhandled Error" err="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.136880 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-bfxx2"] Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.137340 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-bfxx2" Dec 05 23:20:00 crc kubenswrapper[4734]: W1205 23:20:00.141156 4734 reflector.go:561] object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": failed to list *v1.Secret: secrets "node-resolver-dockercfg-kz9s7" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Dec 05 23:20:00 crc kubenswrapper[4734]: E1205 23:20:00.141206 4734 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"node-resolver-dockercfg-kz9s7\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"node-resolver-dockercfg-kz9s7\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.141231 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.142761 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.152793 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:00Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.178931 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:00Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.209061 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:00Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.214324 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6eebee8c-1183-4010-b59c-8f880a4e669d-serviceca\") pod \"node-ca-9l87s\" (UID: \"6eebee8c-1183-4010-b59c-8f880a4e669d\") " pod="openshift-image-registry/node-ca-9l87s" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.214393 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/6eebee8c-1183-4010-b59c-8f880a4e669d-host\") pod \"node-ca-9l87s\" (UID: \"6eebee8c-1183-4010-b59c-8f880a4e669d\") " pod="openshift-image-registry/node-ca-9l87s" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.214411 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh74z\" (UniqueName: \"kubernetes.io/projected/6eebee8c-1183-4010-b59c-8f880a4e669d-kube-api-access-rh74z\") pod \"node-ca-9l87s\" (UID: \"6eebee8c-1183-4010-b59c-8f880a4e669d\") " pod="openshift-image-registry/node-ca-9l87s" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.214432 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8tnw\" (UniqueName: \"kubernetes.io/projected/a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1-kube-api-access-h8tnw\") pod \"node-resolver-bfxx2\" (UID: \"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\") " pod="openshift-dns/node-resolver-bfxx2" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.214452 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1-hosts-file\") pod \"node-resolver-bfxx2\" (UID: \"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\") " pod="openshift-dns/node-resolver-bfxx2" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.246976 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:00Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.268375 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b8065541883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:00Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.295671 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:00Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.315150 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/6eebee8c-1183-4010-b59c-8f880a4e669d-host\") pod \"node-ca-9l87s\" (UID: \"6eebee8c-1183-4010-b59c-8f880a4e669d\") " pod="openshift-image-registry/node-ca-9l87s" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.315198 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh74z\" (UniqueName: \"kubernetes.io/projected/6eebee8c-1183-4010-b59c-8f880a4e669d-kube-api-access-rh74z\") pod \"node-ca-9l87s\" (UID: \"6eebee8c-1183-4010-b59c-8f880a4e669d\") " pod="openshift-image-registry/node-ca-9l87s" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.315222 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8tnw\" (UniqueName: \"kubernetes.io/projected/a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1-kube-api-access-h8tnw\") pod \"node-resolver-bfxx2\" (UID: \"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\") " pod="openshift-dns/node-resolver-bfxx2" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.315245 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1-hosts-file\") pod \"node-resolver-bfxx2\" (UID: \"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\") " pod="openshift-dns/node-resolver-bfxx2" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.315271 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6eebee8c-1183-4010-b59c-8f880a4e669d-serviceca\") pod \"node-ca-9l87s\" (UID: \"6eebee8c-1183-4010-b59c-8f880a4e669d\") " pod="openshift-image-registry/node-ca-9l87s" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.315333 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6eebee8c-1183-4010-b59c-8f880a4e669d-host\") pod \"node-ca-9l87s\" (UID: \"6eebee8c-1183-4010-b59c-8f880a4e669d\") " 
pod="openshift-image-registry/node-ca-9l87s" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.315481 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1-hosts-file\") pod \"node-resolver-bfxx2\" (UID: \"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\") " pod="openshift-dns/node-resolver-bfxx2" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.324062 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:00Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.344971 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:4
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:00Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.365470 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:00Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.384269 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:00Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.398062 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8tnw\" (UniqueName: \"kubernetes.io/projected/a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1-kube-api-access-h8tnw\") pod \"node-resolver-bfxx2\" (UID: \"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\") " pod="openshift-dns/node-resolver-bfxx2" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.416343 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:00Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.431900 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:00Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.457089 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:00Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.478004 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:00Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.495989 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b8065541883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:00Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.511267 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:4
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:00Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.531838 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:00Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.546077 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:00Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.563377 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:00Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.614055 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.614131 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:00 crc kubenswrapper[4734]: I1205 23:20:00.614063 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:00 crc kubenswrapper[4734]: E1205 23:20:00.614250 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:20:00 crc kubenswrapper[4734]: E1205 23:20:00.614373 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:20:00 crc kubenswrapper[4734]: E1205 23:20:00.614480 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.020785 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-vn94d"] Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.021319 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.024264 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.024391 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.024688 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-k52tb"] Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.024764 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.024899 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.025714 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-d6kmh"] Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.026040 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.026068 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-k52tb" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.028054 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.028072 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.028073 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.028192 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.028262 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.028358 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.028566 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.030870 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.043045 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.058444 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.077281 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:4
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.092734 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.105089 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.119138 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.122940 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7da080e9-7084-4e77-9e1a-051dc8b97f25-cnibin\") pod \"multus-additional-cni-plugins-k52tb\" (UID: \"7da080e9-7084-4e77-9e1a-051dc8b97f25\") " pod="openshift-multus/multus-additional-cni-plugins-k52tb" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.123007 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-host-var-lib-cni-multus\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.123034 4734 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-multus-conf-dir\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.123071 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-os-release\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.123099 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-multus-socket-dir-parent\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.123129 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-host-run-netns\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.123258 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2r4r\" (UniqueName: \"kubernetes.io/projected/7da080e9-7084-4e77-9e1a-051dc8b97f25-kube-api-access-x2r4r\") pod \"multus-additional-cni-plugins-k52tb\" (UID: \"7da080e9-7084-4e77-9e1a-051dc8b97f25\") " pod="openshift-multus/multus-additional-cni-plugins-k52tb" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.123323 4734 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7da080e9-7084-4e77-9e1a-051dc8b97f25-system-cni-dir\") pod \"multus-additional-cni-plugins-k52tb\" (UID: \"7da080e9-7084-4e77-9e1a-051dc8b97f25\") " pod="openshift-multus/multus-additional-cni-plugins-k52tb" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.123352 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/65758270-a7a7-46b5-af95-0588daf9fa86-mcd-auth-proxy-config\") pod \"machine-config-daemon-vn94d\" (UID: \"65758270-a7a7-46b5-af95-0588daf9fa86\") " pod="openshift-machine-config-operator/machine-config-daemon-vn94d" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.123375 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-hostroot\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.123481 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/65758270-a7a7-46b5-af95-0588daf9fa86-rootfs\") pod \"machine-config-daemon-vn94d\" (UID: \"65758270-a7a7-46b5-af95-0588daf9fa86\") " pod="openshift-machine-config-operator/machine-config-daemon-vn94d" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.123546 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-host-run-multus-certs\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 
23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.123581 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7da080e9-7084-4e77-9e1a-051dc8b97f25-tuning-conf-dir\") pod \"multus-additional-cni-plugins-k52tb\" (UID: \"7da080e9-7084-4e77-9e1a-051dc8b97f25\") " pod="openshift-multus/multus-additional-cni-plugins-k52tb" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.123627 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/65758270-a7a7-46b5-af95-0588daf9fa86-proxy-tls\") pod \"machine-config-daemon-vn94d\" (UID: \"65758270-a7a7-46b5-af95-0588daf9fa86\") " pod="openshift-machine-config-operator/machine-config-daemon-vn94d" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.123658 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-system-cni-dir\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.123712 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7da080e9-7084-4e77-9e1a-051dc8b97f25-cni-binary-copy\") pod \"multus-additional-cni-plugins-k52tb\" (UID: \"7da080e9-7084-4e77-9e1a-051dc8b97f25\") " pod="openshift-multus/multus-additional-cni-plugins-k52tb" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.123745 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-cnibin\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " 
pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.123771 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-cni-binary-copy\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.123804 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7da080e9-7084-4e77-9e1a-051dc8b97f25-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-k52tb\" (UID: \"7da080e9-7084-4e77-9e1a-051dc8b97f25\") " pod="openshift-multus/multus-additional-cni-plugins-k52tb" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.123828 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-host-var-lib-kubelet\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.123855 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-etc-kubernetes\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.123887 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7da080e9-7084-4e77-9e1a-051dc8b97f25-os-release\") pod \"multus-additional-cni-plugins-k52tb\" (UID: \"7da080e9-7084-4e77-9e1a-051dc8b97f25\") 
" pod="openshift-multus/multus-additional-cni-plugins-k52tb" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.123912 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-host-var-lib-cni-bin\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.123937 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-multus-cni-dir\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.123959 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-multus-daemon-config\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.123980 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js9qp\" (UniqueName: \"kubernetes.io/projected/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-kube-api-access-js9qp\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.124092 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m65jq\" (UniqueName: \"kubernetes.io/projected/65758270-a7a7-46b5-af95-0588daf9fa86-kube-api-access-m65jq\") pod \"machine-config-daemon-vn94d\" (UID: 
\"65758270-a7a7-46b5-af95-0588daf9fa86\") " pod="openshift-machine-config-operator/machine-config-daemon-vn94d" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.124136 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-host-run-k8s-cni-cncf-io\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.135114 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b8065541883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.147165 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.161514 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.180838 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.195285 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.209231 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.225498 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7da080e9-7084-4e77-9e1a-051dc8b97f25-os-release\") pod \"multus-additional-cni-plugins-k52tb\" (UID: \"7da080e9-7084-4e77-9e1a-051dc8b97f25\") " pod="openshift-multus/multus-additional-cni-plugins-k52tb" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.225591 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-host-var-lib-cni-bin\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.225567 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.225620 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-multus-cni-dir\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.225659 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m65jq\" (UniqueName: \"kubernetes.io/projected/65758270-a7a7-46b5-af95-0588daf9fa86-kube-api-access-m65jq\") pod \"machine-config-daemon-vn94d\" (UID: \"65758270-a7a7-46b5-af95-0588daf9fa86\") " pod="openshift-machine-config-operator/machine-config-daemon-vn94d" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.225684 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-host-run-k8s-cni-cncf-io\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.225706 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-multus-daemon-config\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.225739 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js9qp\" (UniqueName: \"kubernetes.io/projected/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-kube-api-access-js9qp\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.225774 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7da080e9-7084-4e77-9e1a-051dc8b97f25-cnibin\") pod \"multus-additional-cni-plugins-k52tb\" (UID: \"7da080e9-7084-4e77-9e1a-051dc8b97f25\") " pod="openshift-multus/multus-additional-cni-plugins-k52tb" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.225802 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-host-var-lib-cni-multus\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.225826 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-multus-conf-dir\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.225827 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-host-var-lib-cni-bin\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.225936 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-host-run-k8s-cni-cncf-io\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226049 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-multus-cni-dir\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226113 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7da080e9-7084-4e77-9e1a-051dc8b97f25-cnibin\") pod \"multus-additional-cni-plugins-k52tb\" (UID: \"7da080e9-7084-4e77-9e1a-051dc8b97f25\") " pod="openshift-multus/multus-additional-cni-plugins-k52tb" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226102 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7da080e9-7084-4e77-9e1a-051dc8b97f25-os-release\") pod \"multus-additional-cni-plugins-k52tb\" (UID: 
\"7da080e9-7084-4e77-9e1a-051dc8b97f25\") " pod="openshift-multus/multus-additional-cni-plugins-k52tb" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226152 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-os-release\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226165 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-host-var-lib-cni-multus\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226199 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-multus-conf-dir\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.225863 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-os-release\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226301 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-multus-socket-dir-parent\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226352 4734 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-host-run-netns\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226387 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-multus-socket-dir-parent\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226391 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2r4r\" (UniqueName: \"kubernetes.io/projected/7da080e9-7084-4e77-9e1a-051dc8b97f25-kube-api-access-x2r4r\") pod \"multus-additional-cni-plugins-k52tb\" (UID: \"7da080e9-7084-4e77-9e1a-051dc8b97f25\") " pod="openshift-multus/multus-additional-cni-plugins-k52tb" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226451 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-host-run-netns\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226455 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7da080e9-7084-4e77-9e1a-051dc8b97f25-system-cni-dir\") pod \"multus-additional-cni-plugins-k52tb\" (UID: \"7da080e9-7084-4e77-9e1a-051dc8b97f25\") " pod="openshift-multus/multus-additional-cni-plugins-k52tb" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226480 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7da080e9-7084-4e77-9e1a-051dc8b97f25-system-cni-dir\") pod \"multus-additional-cni-plugins-k52tb\" (UID: \"7da080e9-7084-4e77-9e1a-051dc8b97f25\") " pod="openshift-multus/multus-additional-cni-plugins-k52tb" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226569 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/65758270-a7a7-46b5-af95-0588daf9fa86-rootfs\") pod \"machine-config-daemon-vn94d\" (UID: \"65758270-a7a7-46b5-af95-0588daf9fa86\") " pod="openshift-machine-config-operator/machine-config-daemon-vn94d" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226591 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/65758270-a7a7-46b5-af95-0588daf9fa86-mcd-auth-proxy-config\") pod \"machine-config-daemon-vn94d\" (UID: \"65758270-a7a7-46b5-af95-0588daf9fa86\") " pod="openshift-machine-config-operator/machine-config-daemon-vn94d" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226609 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-hostroot\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226627 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-host-run-multus-certs\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226648 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/7da080e9-7084-4e77-9e1a-051dc8b97f25-tuning-conf-dir\") pod \"multus-additional-cni-plugins-k52tb\" (UID: \"7da080e9-7084-4e77-9e1a-051dc8b97f25\") " pod="openshift-multus/multus-additional-cni-plugins-k52tb" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226681 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-hostroot\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226704 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/65758270-a7a7-46b5-af95-0588daf9fa86-proxy-tls\") pod \"machine-config-daemon-vn94d\" (UID: \"65758270-a7a7-46b5-af95-0588daf9fa86\") " pod="openshift-machine-config-operator/machine-config-daemon-vn94d" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226724 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-host-run-multus-certs\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226738 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-system-cni-dir\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226774 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7da080e9-7084-4e77-9e1a-051dc8b97f25-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-k52tb\" (UID: \"7da080e9-7084-4e77-9e1a-051dc8b97f25\") " pod="openshift-multus/multus-additional-cni-plugins-k52tb" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226795 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-multus-daemon-config\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226812 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-cnibin\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226835 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-cni-binary-copy\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226855 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7da080e9-7084-4e77-9e1a-051dc8b97f25-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-k52tb\" (UID: \"7da080e9-7084-4e77-9e1a-051dc8b97f25\") " pod="openshift-multus/multus-additional-cni-plugins-k52tb" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226873 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-host-var-lib-kubelet\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " 
pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226874 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-system-cni-dir\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226914 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/65758270-a7a7-46b5-af95-0588daf9fa86-rootfs\") pod \"machine-config-daemon-vn94d\" (UID: \"65758270-a7a7-46b5-af95-0588daf9fa86\") " pod="openshift-machine-config-operator/machine-config-daemon-vn94d" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226949 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-cnibin\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226929 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-etc-kubernetes\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.226987 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-host-var-lib-kubelet\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.227159 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7da080e9-7084-4e77-9e1a-051dc8b97f25-tuning-conf-dir\") pod \"multus-additional-cni-plugins-k52tb\" (UID: \"7da080e9-7084-4e77-9e1a-051dc8b97f25\") " pod="openshift-multus/multus-additional-cni-plugins-k52tb" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.227171 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-etc-kubernetes\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.227724 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/65758270-a7a7-46b5-af95-0588daf9fa86-mcd-auth-proxy-config\") pod \"machine-config-daemon-vn94d\" (UID: \"65758270-a7a7-46b5-af95-0588daf9fa86\") " pod="openshift-machine-config-operator/machine-config-daemon-vn94d" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.227800 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-cni-binary-copy\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.227966 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7da080e9-7084-4e77-9e1a-051dc8b97f25-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-k52tb\" (UID: \"7da080e9-7084-4e77-9e1a-051dc8b97f25\") " pod="openshift-multus/multus-additional-cni-plugins-k52tb" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.227966 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7da080e9-7084-4e77-9e1a-051dc8b97f25-cni-binary-copy\") pod \"multus-additional-cni-plugins-k52tb\" (UID: \"7da080e9-7084-4e77-9e1a-051dc8b97f25\") " pod="openshift-multus/multus-additional-cni-plugins-k52tb" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.231897 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/65758270-a7a7-46b5-af95-0588daf9fa86-proxy-tls\") pod \"machine-config-daemon-vn94d\" (UID: \"65758270-a7a7-46b5-af95-0588daf9fa86\") " pod="openshift-machine-config-operator/machine-config-daemon-vn94d" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.242463 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m65jq\" (UniqueName: \"kubernetes.io/projected/65758270-a7a7-46b5-af95-0588daf9fa86-kube-api-access-m65jq\") pod \"machine-config-daemon-vn94d\" (UID: \"65758270-a7a7-46b5-af95-0588daf9fa86\") " pod="openshift-machine-config-operator/machine-config-daemon-vn94d" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.244409 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js9qp\" (UniqueName: \"kubernetes.io/projected/1d76dc4e-40f3-4457-9a99-16f9a8ca8081-kube-api-access-js9qp\") pod \"multus-d6kmh\" (UID: \"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\") " pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.246596 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.249917 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2r4r\" (UniqueName: \"kubernetes.io/projected/7da080e9-7084-4e77-9e1a-051dc8b97f25-kube-api-access-x2r4r\") pod \"multus-additional-cni-plugins-k52tb\" (UID: \"7da080e9-7084-4e77-9e1a-051dc8b97f25\") " pod="openshift-multus/multus-additional-cni-plugins-k52tb" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.261028 4734 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.276303 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\
\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335
afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.294824 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.309754 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.311274 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 05 23:20:01 crc kubenswrapper[4734]: E1205 23:20:01.315633 4734 configmap.go:193] Couldn't get configMap openshift-image-registry/image-registry-certificates: failed to sync configmap cache: timed out waiting for the condition Dec 05 23:20:01 crc kubenswrapper[4734]: E1205 23:20:01.315744 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6eebee8c-1183-4010-b59c-8f880a4e669d-serviceca podName:6eebee8c-1183-4010-b59c-8f880a4e669d nodeName:}" failed. No retries permitted until 2025-12-05 23:20:01.815708471 +0000 UTC m=+22.499112747 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serviceca" (UniqueName: "kubernetes.io/configmap/6eebee8c-1183-4010-b59c-8f880a4e669d-serviceca") pod "node-ca-9l87s" (UID: "6eebee8c-1183-4010-b59c-8f880a4e669d") : failed to sync configmap cache: timed out waiting for the condition Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.318805 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bfxx2" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.325561 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: E1205 23:20:01.334995 4734 projected.go:288] Couldn't get configMap openshift-image-registry/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.335217 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" Dec 05 23:20:01 crc kubenswrapper[4734]: W1205 23:20:01.336827 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2f57d8d_f8e7_4ccc_b41f_26ebca61d0f1.slice/crio-d06a3d838178b90b5573c33f6c55ceac5347b5f00f76cb3caed4bee04202dd0d WatchSource:0}: Error finding container d06a3d838178b90b5573c33f6c55ceac5347b5f00f76cb3caed4bee04202dd0d: Status 404 returned error can't find the container with id d06a3d838178b90b5573c33f6c55ceac5347b5f00f76cb3caed4bee04202dd0d Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.340932 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b8065541883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.341773 4734 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-d6kmh" Dec 05 23:20:01 crc kubenswrapper[4734]: W1205 23:20:01.347902 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65758270_a7a7_46b5_af95_0588daf9fa86.slice/crio-0d2717549ed28ed24e960afc6904acf6cb7924447ccc485d5ebfbbce000b70b5 WatchSource:0}: Error finding container 0d2717549ed28ed24e960afc6904acf6cb7924447ccc485d5ebfbbce000b70b5: Status 404 returned error can't find the container with id 0d2717549ed28ed24e960afc6904acf6cb7924447ccc485d5ebfbbce000b70b5 Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.348011 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-k52tb" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.356758 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: W1205 23:20:01.365929 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d76dc4e_40f3_4457_9a99_16f9a8ca8081.slice/crio-5e9ff5eb87fa1b3e135fd9c821696cc8253402084a10bc4d49b2c7a33e3c3fdb WatchSource:0}: Error 
finding container 5e9ff5eb87fa1b3e135fd9c821696cc8253402084a10bc4d49b2c7a33e3c3fdb: Status 404 returned error can't find the container with id 5e9ff5eb87fa1b3e135fd9c821696cc8253402084a10bc4d49b2c7a33e3c3fdb Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.370294 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.373829 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: W1205 23:20:01.375730 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7da080e9_7084_4e77_9e1a_051dc8b97f25.slice/crio-1ef75c41581538d27f4f7c4a950b49fd9545fbc5c1e378556b39de967f4b9e88 WatchSource:0}: Error finding container 1ef75c41581538d27f4f7c4a950b49fd9545fbc5c1e378556b39de967f4b9e88: Status 404 returned error can't find the container with id 1ef75c41581538d27f4f7c4a950b49fd9545fbc5c1e378556b39de967f4b9e88 Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.394745 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.404550 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8bfg7"] Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.405762 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.410775 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.410969 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.411246 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.411395 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.411569 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.411717 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.411878 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.412684 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.413429 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.427855 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.443159 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.457180 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.468640 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.479647 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.482818 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: E1205 23:20:01.485167 4734 projected.go:194] Error preparing data for projected volume kube-api-access-rh74z for pod openshift-image-registry/node-ca-9l87s: failed to sync configmap cache: timed out waiting for the condition Dec 05 23:20:01 crc kubenswrapper[4734]: E1205 23:20:01.485380 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6eebee8c-1183-4010-b59c-8f880a4e669d-kube-api-access-rh74z podName:6eebee8c-1183-4010-b59c-8f880a4e669d nodeName:}" failed. No retries permitted until 2025-12-05 23:20:01.985328859 +0000 UTC m=+22.668733135 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-rh74z" (UniqueName: "kubernetes.io/projected/6eebee8c-1183-4010-b59c-8f880a4e669d-kube-api-access-rh74z") pod "node-ca-9l87s" (UID: "6eebee8c-1183-4010-b59c-8f880a4e669d") : failed to sync configmap cache: timed out waiting for the condition Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.503264 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2927a376-2f69-4820-a222-b86f08ece55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8bfg7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.524889 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.529926 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-etc-openvswitch\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.529969 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-cni-netd\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.529991 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2927a376-2f69-4820-a222-b86f08ece55a-ovnkube-config\") pod \"ovnkube-node-8bfg7\" 
(UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.530012 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2927a376-2f69-4820-a222-b86f08ece55a-env-overrides\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.530033 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-run-openvswitch\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.530058 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-log-socket\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.530079 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-run-ovn-kubernetes\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.530111 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-node-log\") pod 
\"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.530136 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2927a376-2f69-4820-a222-b86f08ece55a-ovn-node-metrics-cert\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.530158 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2927a376-2f69-4820-a222-b86f08ece55a-ovnkube-script-lib\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.530181 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-slash\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.530206 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-var-lib-openvswitch\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.530229 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-run-ovn\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.530250 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.530433 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-systemd-units\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.530484 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-run-netns\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.530609 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssw64\" (UniqueName: \"kubernetes.io/projected/2927a376-2f69-4820-a222-b86f08ece55a-kube-api-access-ssw64\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.530754 4734 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-cni-bin\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.530813 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-run-systemd\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.530847 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-kubelet\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.541200 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.554196 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.568685 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.593897 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b8065541883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.607984 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.620505 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.631325 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-systemd-units\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.631370 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-run-netns\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.631389 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssw64\" (UniqueName: \"kubernetes.io/projected/2927a376-2f69-4820-a222-b86f08ece55a-kube-api-access-ssw64\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.631416 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-cni-bin\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.631442 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-run-systemd\") pod 
\"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.631462 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-kubelet\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.631506 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-cni-netd\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.631520 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-etc-openvswitch\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.631560 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2927a376-2f69-4820-a222-b86f08ece55a-ovnkube-config\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.631575 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2927a376-2f69-4820-a222-b86f08ece55a-env-overrides\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.631558 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-systemd-units\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.631591 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-run-openvswitch\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.631641 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-run-openvswitch\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.631696 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-cni-netd\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.631725 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-etc-openvswitch\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 
23:20:01.632084 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-cni-bin\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.632124 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-run-systemd\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.632191 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-log-socket\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.632215 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-run-ovn-kubernetes\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.632396 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-kubelet\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.632439 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-run-netns\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.632451 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2927a376-2f69-4820-a222-b86f08ece55a-ovnkube-config\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.632484 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-run-ovn-kubernetes\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.632548 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-log-socket\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.632637 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-node-log\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.632674 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2927a376-2f69-4820-a222-b86f08ece55a-ovn-node-metrics-cert\") 
pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.632712 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2927a376-2f69-4820-a222-b86f08ece55a-ovnkube-script-lib\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.632728 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-node-log\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.632743 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-slash\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.632767 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-var-lib-openvswitch\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.632793 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-run-ovn\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.632820 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.632829 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2927a376-2f69-4820-a222-b86f08ece55a-env-overrides\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.632883 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-slash\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.632919 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-var-lib-openvswitch\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.632953 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-run-ovn\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc 
kubenswrapper[4734]: I1205 23:20:01.632986 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.633441 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2927a376-2f69-4820-a222-b86f08ece55a-ovnkube-script-lib\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.637032 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2927a376-2f69-4820-a222-b86f08ece55a-ovn-node-metrics-cert\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.674968 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.680399 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssw64\" (UniqueName: \"kubernetes.io/projected/2927a376-2f69-4820-a222-b86f08ece55a-kube-api-access-ssw64\") pod \"ovnkube-node-8bfg7\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.701072 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.770719 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d6kmh" event={"ID":"1d76dc4e-40f3-4457-9a99-16f9a8ca8081","Type":"ContainerStarted","Data":"ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8"} Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.770771 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d6kmh" event={"ID":"1d76dc4e-40f3-4457-9a99-16f9a8ca8081","Type":"ContainerStarted","Data":"5e9ff5eb87fa1b3e135fd9c821696cc8253402084a10bc4d49b2c7a33e3c3fdb"} Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.773397 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-vn94d" event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerStarted","Data":"f761cb9e068ee2d46de1b4604f8403e36d7d0d7b8133f0fcb0da1f312f1ef704"} Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.773470 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerStarted","Data":"2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18"} Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.773487 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerStarted","Data":"0d2717549ed28ed24e960afc6904acf6cb7924447ccc485d5ebfbbce000b70b5"} Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.775001 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bfxx2" event={"ID":"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1","Type":"ContainerStarted","Data":"2937461b56d6a54bf46d04d1246ef99a00bcc8072b52ccc25001376a3b640fb0"} Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.775045 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bfxx2" event={"ID":"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1","Type":"ContainerStarted","Data":"d06a3d838178b90b5573c33f6c55ceac5347b5f00f76cb3caed4bee04202dd0d"} Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.776515 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"3c14bdf9de3cac15f0fff38f916e8da01527893739df49f94b97d7aebc76875a"} Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.778219 4734 generic.go:334] "Generic (PLEG): container finished" 
podID="7da080e9-7084-4e77-9e1a-051dc8b97f25" containerID="a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a" exitCode=0 Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.778248 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" event={"ID":"7da080e9-7084-4e77-9e1a-051dc8b97f25","Type":"ContainerDied","Data":"a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a"} Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.778262 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" event={"ID":"7da080e9-7084-4e77-9e1a-051dc8b97f25","Type":"ContainerStarted","Data":"1ef75c41581538d27f4f7c4a950b49fd9545fbc5c1e378556b39de967f4b9e88"} Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.796167 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.809295 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.825473 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.828412 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-
apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256
:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"start
edAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.834663 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.834821 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6eebee8c-1183-4010-b59c-8f880a4e669d-serviceca\") pod \"node-ca-9l87s\" (UID: \"6eebee8c-1183-4010-b59c-8f880a4e669d\") " pod="openshift-image-registry/node-ca-9l87s" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.834890 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.834938 4734 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:01 crc kubenswrapper[4734]: E1205 23:20:01.835055 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:20:05.835012493 +0000 UTC m=+26.518416769 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:20:01 crc kubenswrapper[4734]: E1205 23:20:01.835779 4734 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 23:20:01 crc kubenswrapper[4734]: E1205 23:20:01.835839 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 23:20:05.835827673 +0000 UTC m=+26.519231959 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 23:20:01 crc kubenswrapper[4734]: E1205 23:20:01.836062 4734 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 23:20:01 crc kubenswrapper[4734]: E1205 23:20:01.836178 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 23:20:05.83614205 +0000 UTC m=+26.519546506 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.837497 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6eebee8c-1183-4010-b59c-8f880a4e669d-serviceca\") pod \"node-ca-9l87s\" (UID: \"6eebee8c-1183-4010-b59c-8f880a4e669d\") " pod="openshift-image-registry/node-ca-9l87s" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.852941 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.865155 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.879339 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.902171 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2927a376-2f69-4820-a222-b86f08ece55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8bfg7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.925921 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.936149 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.936233 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:01 crc kubenswrapper[4734]: E1205 23:20:01.936974 4734 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 23:20:01 crc kubenswrapper[4734]: E1205 23:20:01.937024 4734 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 23:20:01 crc kubenswrapper[4734]: E1205 23:20:01.937041 4734 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 23:20:01 crc kubenswrapper[4734]: E1205 23:20:01.937065 4734 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 23:20:01 crc kubenswrapper[4734]: E1205 23:20:01.937110 4734 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 23:20:01 crc kubenswrapper[4734]: E1205 23:20:01.937120 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 23:20:05.937095871 +0000 UTC m=+26.620500147 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 23:20:01 crc kubenswrapper[4734]: E1205 23:20:01.937126 4734 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 23:20:01 crc kubenswrapper[4734]: E1205 23:20:01.937233 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 23:20:05.937206904 +0000 UTC m=+26.620611180 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 23:20:01 crc kubenswrapper[4734]: I1205 23:20:01.966364 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.005932 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:02Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.037075 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh74z\" (UniqueName: \"kubernetes.io/projected/6eebee8c-1183-4010-b59c-8f880a4e669d-kube-api-access-rh74z\") pod \"node-ca-9l87s\" (UID: \"6eebee8c-1183-4010-b59c-8f880a4e669d\") " pod="openshift-image-registry/node-ca-9l87s" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.042517 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh74z\" (UniqueName: \"kubernetes.io/projected/6eebee8c-1183-4010-b59c-8f880a4e669d-kube-api-access-rh74z\") pod \"node-ca-9l87s\" (UID: \"6eebee8c-1183-4010-b59c-8f880a4e669d\") " pod="openshift-image-registry/node-ca-9l87s" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.051072 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:02Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.083752 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:02Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.124682 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b8065541883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:02Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.164280 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:02Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.206901 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:02Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.231422 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-9l87s" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.242183 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:02Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:02 crc kubenswrapper[4734]: W1205 23:20:02.249618 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6eebee8c_1183_4010_b59c_8f880a4e669d.slice/crio-fc8cf911a3cce4e89fc07d17cae65eac466adac549fbc8ab84ce082bc845c3d9 WatchSource:0}: Error finding container fc8cf911a3cce4e89fc07d17cae65eac466adac549fbc8ab84ce082bc845c3d9: Status 404 returned error can't find the container with id 
fc8cf911a3cce4e89fc07d17cae65eac466adac549fbc8ab84ce082bc845c3d9 Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.289316 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\
":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-05T23:20:02Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.327859 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2927a376-2f69-4820-a222-b86f08ece55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8bfg7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:02Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.366511 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:02Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.402882 4734 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.406654 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:02Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.406898 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.406939 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.406950 
4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.407116 4734 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.463447 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:02Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.476429 4734 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.476992 4734 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.478619 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.478670 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.478682 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.478702 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.478715 4734 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:02Z","lastTransitionTime":"2025-12-05T23:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:02 crc kubenswrapper[4734]: E1205 23:20:02.495891 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:02Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.499401 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.499442 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.499453 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.499472 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.499484 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:02Z","lastTransitionTime":"2025-12-05T23:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:02 crc kubenswrapper[4734]: E1205 23:20:02.516504 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:02Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.521196 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.521252 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.521268 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.521289 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.521307 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:02Z","lastTransitionTime":"2025-12-05T23:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.524232 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c14bdf9de3cac15f0fff38f916e8da01527893739df49f94b97d7aebc76875a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:02Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:02 crc kubenswrapper[4734]: E1205 23:20:02.534778 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"message\\\":\\\"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redh
at/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99
d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815
\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\"
:448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:02Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.539639 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.539696 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.539710 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.539733 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.539749 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:02Z","lastTransitionTime":"2025-12-05T23:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:02 crc kubenswrapper[4734]: E1205 23:20:02.552770 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:02Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.557317 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.557391 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.557408 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.557435 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.557453 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:02Z","lastTransitionTime":"2025-12-05T23:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.563567 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b8065541883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:02Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:02 crc kubenswrapper[4734]: E1205 23:20:02.569765 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:02Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:02 crc kubenswrapper[4734]: E1205 23:20:02.569881 4734 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.571736 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.571772 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.571785 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.571803 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.571813 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:02Z","lastTransitionTime":"2025-12-05T23:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.602998 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2937461b56d6a54bf46d04d1246ef99a00bcc8072b52ccc25001376a3b640fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:02Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.613830 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.613857 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.613965 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:02 crc kubenswrapper[4734]: E1205 23:20:02.614004 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:20:02 crc kubenswrapper[4734]: E1205 23:20:02.614197 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:20:02 crc kubenswrapper[4734]: E1205 23:20:02.614436 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.649172 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f761cb9e068ee2d46de1b4604f8403e36d7d0d7b8133f0fcb0da1f312f1ef704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0098a95c28de2d528d5dacf74969042d17d545
bc6ee66496c46da61324ec18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:02Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.674566 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.674605 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.674617 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:02 crc 
kubenswrapper[4734]: I1205 23:20:02.674635 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.674646 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:02Z","lastTransitionTime":"2025-12-05T23:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.691251 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:02Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.726769 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:02Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.766453 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:02Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.777768 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.777830 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.777844 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.777864 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.777881 4734 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:02Z","lastTransitionTime":"2025-12-05T23:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.783263 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9l87s" event={"ID":"6eebee8c-1183-4010-b59c-8f880a4e669d","Type":"ContainerStarted","Data":"7b6c6b8505646feac77ac9d5fa758360c9f9a9f721ee74b52f449ec8ed30dba2"} Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.783349 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9l87s" event={"ID":"6eebee8c-1183-4010-b59c-8f880a4e669d","Type":"ContainerStarted","Data":"fc8cf911a3cce4e89fc07d17cae65eac466adac549fbc8ab84ce082bc845c3d9"} Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.785720 4734 generic.go:334] "Generic (PLEG): container finished" podID="7da080e9-7084-4e77-9e1a-051dc8b97f25" containerID="6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87" exitCode=0 Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.785769 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" event={"ID":"7da080e9-7084-4e77-9e1a-051dc8b97f25","Type":"ContainerDied","Data":"6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87"} Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.788981 4734 generic.go:334] "Generic (PLEG): container finished" podID="2927a376-2f69-4820-a222-b86f08ece55a" containerID="bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6" exitCode=0 Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.789109 4734 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" event={"ID":"2927a376-2f69-4820-a222-b86f08ece55a","Type":"ContainerDied","Data":"bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6"} Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.789201 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" event={"ID":"2927a376-2f69-4820-a222-b86f08ece55a","Type":"ContainerStarted","Data":"ad55e3286c595ac94160c058992d98d313047da1faf163f9547e7e2c17cdfebc"} Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.807565 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:02Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.847637 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:02Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.882210 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.882615 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.882632 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 
23:20:02.882655 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.882668 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:02Z","lastTransitionTime":"2025-12-05T23:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.885249 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c14bdf9de3cac15f0fff38f916e8da01527893739df49f94b97d7aebc76875a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:02Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.932257 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b8065541883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:02Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.963263 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2937461b56d6a54bf46d04d1246ef99a00bcc8072b52ccc25001376a3b640fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:02Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.985853 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.985904 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.985916 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.985940 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:02 crc kubenswrapper[4734]: I1205 23:20:02.985955 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:02Z","lastTransitionTime":"2025-12-05T23:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.006032 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f761cb9e068ee2d46de1b4604f8403e36d7d0d7b8133f0fcb0da1f312f1ef704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:03Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.054931 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:03Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.083669 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:03Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.088658 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.088721 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.088739 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.088764 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.088780 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:03Z","lastTransitionTime":"2025-12-05T23:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.124610 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:03Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.166121 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:4
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:03Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.191500 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.191562 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.191575 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.191592 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.191603 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:03Z","lastTransitionTime":"2025-12-05T23:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.205499 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:03Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.256978 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b6c6b8505646feac77ac9d5fa758360c9f9a9f721ee74b52f449ec8ed30dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:03Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.294777 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.294835 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.294846 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.294900 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.294914 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:03Z","lastTransitionTime":"2025-12-05T23:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.307899 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:03Z 
is after 2025-08-24T17:21:41Z" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.338921 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2927a376-2f69-4820-a222-b86f08ece55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8bfg7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:03Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.364290 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:03Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.397232 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.397923 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.397942 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.397964 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.397979 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:03Z","lastTransitionTime":"2025-12-05T23:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.407491 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:03Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.449116 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2927a376-2f69-4820-a222-b86f08ece55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8bfg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:03Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.485749 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335
e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:03Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.500473 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.500536 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.500549 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.500564 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 
05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.500575 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:03Z","lastTransitionTime":"2025-12-05T23:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.527248 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:03Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.562438 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b6c6b8505646feac77ac9d5fa758360c9f9a9f721ee74b52f449ec8ed30dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:03Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.603333 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c14bdf9de3cac15f0fff38f916e8da01527893739df49f94b97d7aebc76875a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T2
3:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:03Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.604123 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.604176 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.604186 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.604208 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.604224 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:03Z","lastTransitionTime":"2025-12-05T23:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.648429 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b8065541883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:03Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.682227 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2937461b56d6a54bf46d04d1246ef99a00bcc8072b52ccc25001376a3b640fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:03Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.707905 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.707968 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.707998 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.708025 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.708041 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:03Z","lastTransitionTime":"2025-12-05T23:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.723776 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f761cb9e068ee2d46de1b4604f8403e36d7d0d7b8133f0fcb0da1f312f1ef704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:03Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.771043 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:03Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.796203 4734 generic.go:334] "Generic (PLEG): container finished" podID="7da080e9-7084-4e77-9e1a-051dc8b97f25" containerID="c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328" exitCode=0 Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.796282 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" 
event={"ID":"7da080e9-7084-4e77-9e1a-051dc8b97f25","Type":"ContainerDied","Data":"c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328"} Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.801821 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" event={"ID":"2927a376-2f69-4820-a222-b86f08ece55a","Type":"ContainerStarted","Data":"c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e"} Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.801876 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" event={"ID":"2927a376-2f69-4820-a222-b86f08ece55a","Type":"ContainerStarted","Data":"6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4"} Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.801889 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" event={"ID":"2927a376-2f69-4820-a222-b86f08ece55a","Type":"ContainerStarted","Data":"846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1"} Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.801899 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" event={"ID":"2927a376-2f69-4820-a222-b86f08ece55a","Type":"ContainerStarted","Data":"69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946"} Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.801910 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" event={"ID":"2927a376-2f69-4820-a222-b86f08ece55a","Type":"ContainerStarted","Data":"bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7"} Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.801920 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" 
event={"ID":"2927a376-2f69-4820-a222-b86f08ece55a","Type":"ContainerStarted","Data":"d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a"} Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.810386 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.810646 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.810774 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.811053 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.811204 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:03Z","lastTransitionTime":"2025-12-05T23:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.816129 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:03Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.846790 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:03Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.887434 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:03Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.915323 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.915372 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.915383 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.915403 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.915415 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:03Z","lastTransitionTime":"2025-12-05T23:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.929590 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:03Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:03 crc kubenswrapper[4734]: I1205 23:20:03.964994 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:03Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.005549 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:04Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.018076 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.018119 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.018129 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.018146 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.018158 4734 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:04Z","lastTransitionTime":"2025-12-05T23:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.044247 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b6c6b8505646feac77ac9d5fa758360c9f9a9f721ee74b52f449ec8ed30dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:04Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.084830 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:04Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.120605 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:04 crc 
kubenswrapper[4734]: I1205 23:20:04.120669 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.120683 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.120709 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.120726 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:04Z","lastTransitionTime":"2025-12-05T23:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.128444 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2927a376-2f69-4820-a222-b86f08ece55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8bfg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:04Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.165379 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:4
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:04Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.205034 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-
kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:04Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.223607 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.223675 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.223686 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.223707 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.223721 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:04Z","lastTransitionTime":"2025-12-05T23:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.245482 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:04Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.284600 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c14bdf9de3cac15f0fff38f916e8da01527893739df49f94b97d7aebc76875a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T23:20:04Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.327237 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.327293 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.327305 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.327326 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.327337 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:04Z","lastTransitionTime":"2025-12-05T23:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.329154 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b8065541883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:04Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.366735 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2937461b56d6a54bf46d04d1246ef99a00bcc8072b52ccc25001376a3b640fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:04Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.403629 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f761cb9e068ee2d46de1b4604f8403e36d7d0d7b8133f0fcb0da1f312f1ef704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:04Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.430819 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.430877 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.430891 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.430914 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.430926 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:04Z","lastTransitionTime":"2025-12-05T23:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.445040 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:04Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 
23:20:04.534136 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.534202 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.534217 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.534242 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.534257 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:04Z","lastTransitionTime":"2025-12-05T23:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.613133 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.613258 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:04 crc kubenswrapper[4734]: E1205 23:20:04.613323 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.613378 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:04 crc kubenswrapper[4734]: E1205 23:20:04.613476 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:20:04 crc kubenswrapper[4734]: E1205 23:20:04.613599 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.636862 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.636923 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.636963 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.636997 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.637014 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:04Z","lastTransitionTime":"2025-12-05T23:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.740332 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.740408 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.740449 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.740481 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.740500 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:04Z","lastTransitionTime":"2025-12-05T23:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.810175 4734 generic.go:334] "Generic (PLEG): container finished" podID="7da080e9-7084-4e77-9e1a-051dc8b97f25" containerID="e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112" exitCode=0 Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.810231 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" event={"ID":"7da080e9-7084-4e77-9e1a-051dc8b97f25","Type":"ContainerDied","Data":"e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112"} Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.825619 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f761cb9e068ee2d46de1b4604f8403e36d7d0d7b8133f0fcb0da1f312f1ef704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef
318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:04Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.843079 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:04Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.843199 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.843342 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.843351 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.843370 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.843383 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:04Z","lastTransitionTime":"2025-12-05T23:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.862014 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:04Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.874806 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:04Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.889683 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c14bdf9de3cac15f0fff38f916e8da01527893739df49f94b97d7aebc76875a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T23:20:04Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.901672 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b80655
41883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:04Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.916318 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2937461b56d6a54bf46d04d1246ef99a00bcc8072b52ccc25001376a3b640fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:04Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.931743 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:04Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.945764 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:04Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.946179 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.946219 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.946229 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.946250 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.946261 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:04Z","lastTransitionTime":"2025-12-05T23:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.961490 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/cr
cont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05
T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:04Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.974393 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:04Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.986268 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b6c6b8505646feac77ac9d5fa758360c9f9a9f721ee74b52f449ec8ed30dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:04Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:04 crc kubenswrapper[4734]: I1205 23:20:04.999481 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:04Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.020254 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2927a376-2f69-4820-a222-b86f08ece55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8bfg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:05Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.049930 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.050008 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.050026 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.050052 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.050069 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:05Z","lastTransitionTime":"2025-12-05T23:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.153267 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.153324 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.153335 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.153354 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.153564 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:05Z","lastTransitionTime":"2025-12-05T23:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.256941 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.256989 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.257001 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.257022 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.257037 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:05Z","lastTransitionTime":"2025-12-05T23:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.360304 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.360358 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.360370 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.360391 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.360403 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:05Z","lastTransitionTime":"2025-12-05T23:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.464676 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.464761 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.464783 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.464811 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.464831 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:05Z","lastTransitionTime":"2025-12-05T23:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.567946 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.568020 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.568044 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.568070 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.568090 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:05Z","lastTransitionTime":"2025-12-05T23:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.672671 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.672748 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.672768 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.672801 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.672823 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:05Z","lastTransitionTime":"2025-12-05T23:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.776956 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.777032 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.777050 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.777079 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.777093 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:05Z","lastTransitionTime":"2025-12-05T23:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.819691 4734 generic.go:334] "Generic (PLEG): container finished" podID="7da080e9-7084-4e77-9e1a-051dc8b97f25" containerID="3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db" exitCode=0 Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.819781 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" event={"ID":"7da080e9-7084-4e77-9e1a-051dc8b97f25","Type":"ContainerDied","Data":"3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db"} Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.832170 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" event={"ID":"2927a376-2f69-4820-a222-b86f08ece55a","Type":"ContainerStarted","Data":"de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb"} Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.845764 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:4
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:05Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.863483 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:05Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.878307 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b6c6b8505646feac77ac9d5fa758360c9f9a9f721ee74b52f449ec8ed30dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:05Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.880101 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.880166 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.880181 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.880213 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.880230 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:05Z","lastTransitionTime":"2025-12-05T23:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.893701 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.893660 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:05Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:05 crc kubenswrapper[4734]: E1205 23:20:05.893980 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:20:13.893954306 +0000 UTC m=+34.577358582 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.894049 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.894129 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:05 crc kubenswrapper[4734]: E1205 23:20:05.894265 4734 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 23:20:05 crc kubenswrapper[4734]: E1205 23:20:05.894339 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 23:20:13.894316145 +0000 UTC m=+34.577720421 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 23:20:05 crc kubenswrapper[4734]: E1205 23:20:05.894274 4734 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 23:20:05 crc kubenswrapper[4734]: E1205 23:20:05.894757 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 23:20:13.894742836 +0000 UTC m=+34.578147322 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.926306 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2927a376-2f69-4820-a222-b86f08ece55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8bfg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:05Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.943579 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f761cb9e068ee2d46de1b4604f8403e36d7d0d7b8133f0fcb0da1f312f1ef704\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:05Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.959449 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:05Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.981057 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:05Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.983775 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.983811 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.983826 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.983848 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.983862 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:05Z","lastTransitionTime":"2025-12-05T23:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.994812 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.994887 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:05 crc kubenswrapper[4734]: E1205 23:20:05.995059 4734 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 23:20:05 crc kubenswrapper[4734]: E1205 23:20:05.995106 4734 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 23:20:05 crc kubenswrapper[4734]: E1205 23:20:05.995125 4734 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 23:20:05 crc kubenswrapper[4734]: E1205 23:20:05.995123 4734 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 
23:20:05 crc kubenswrapper[4734]: E1205 23:20:05.995155 4734 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 23:20:05 crc kubenswrapper[4734]: E1205 23:20:05.995170 4734 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 23:20:05 crc kubenswrapper[4734]: E1205 23:20:05.995223 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 23:20:13.995192664 +0000 UTC m=+34.678597120 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 23:20:05 crc kubenswrapper[4734]: E1205 23:20:05.995266 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 23:20:13.995245776 +0000 UTC m=+34.678650052 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 23:20:05 crc kubenswrapper[4734]: I1205 23:20:05.997912 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:05Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.013328 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c14bdf9de3cac15f0fff38f916e8da01527893739df49f94b97d7aebc76875a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T23:20:06Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.028883 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b80655
41883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:06Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.041405 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2937461b56d6a54bf46d04d1246ef99a00bcc8072b52ccc25001376a3b640fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:06Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.060961 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:06Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.077510 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:06Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.086906 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.086935 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.086947 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.086963 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.086976 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:06Z","lastTransitionTime":"2025-12-05T23:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.190446 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.190519 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.190597 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.190633 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.190658 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:06Z","lastTransitionTime":"2025-12-05T23:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.293382 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.293454 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.293473 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.293504 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.293562 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:06Z","lastTransitionTime":"2025-12-05T23:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.396256 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.396359 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.396380 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.396401 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.396421 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:06Z","lastTransitionTime":"2025-12-05T23:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.499804 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.499857 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.499869 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.499887 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.499905 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:06Z","lastTransitionTime":"2025-12-05T23:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.603108 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.603175 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.603189 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.603217 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.603233 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:06Z","lastTransitionTime":"2025-12-05T23:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.613377 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.613401 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.613618 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:06 crc kubenswrapper[4734]: E1205 23:20:06.613703 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:20:06 crc kubenswrapper[4734]: E1205 23:20:06.613773 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:20:06 crc kubenswrapper[4734]: E1205 23:20:06.613858 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.706369 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.706424 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.706436 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.706454 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.706465 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:06Z","lastTransitionTime":"2025-12-05T23:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.809172 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.809257 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.809273 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.809304 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.809324 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:06Z","lastTransitionTime":"2025-12-05T23:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.840353 4734 generic.go:334] "Generic (PLEG): container finished" podID="7da080e9-7084-4e77-9e1a-051dc8b97f25" containerID="95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7" exitCode=0 Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.840414 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" event={"ID":"7da080e9-7084-4e77-9e1a-051dc8b97f25","Type":"ContainerDied","Data":"95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7"} Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.883937 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f761cb9e068ee2d46de1b4604f8403e36d7d0d7b8133f0fcb0da1f312f1ef704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef
318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:06Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.912306 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.912365 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.912379 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.912404 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.912417 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:06Z","lastTransitionTime":"2025-12-05T23:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.912344 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:06Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.929296 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:06Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.942191 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:06Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.956211 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c14bdf9de3cac15f0fff38f916e8da01527893739df49f94b97d7aebc76875a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T23:20:06Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.971754 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b80655
41883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:06Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:06 crc kubenswrapper[4734]: I1205 23:20:06.985257 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2937461b56d6a54bf46d04d1246ef99a00bcc8072b52ccc25001376a3b640fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:06Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.001031 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:06Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.014384 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:07Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.015138 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.015182 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.015195 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.015214 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.015227 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:07Z","lastTransitionTime":"2025-12-05T23:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.028628 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/cr
cont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05
T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:07Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.042815 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:07Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.058455 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b6c6b8505646feac77ac9d5fa758360c9f9a9f721ee74b52f449ec8ed30dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:07Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.074912 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:07Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.096700 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2927a376-2f69-4820-a222-b86f08ece55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8bfg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:07Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.118278 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.118326 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.118337 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.118357 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.118369 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:07Z","lastTransitionTime":"2025-12-05T23:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.221112 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.221178 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.221192 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.221214 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.221227 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:07Z","lastTransitionTime":"2025-12-05T23:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.324949 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.325026 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.325049 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.325081 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.325105 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:07Z","lastTransitionTime":"2025-12-05T23:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.428448 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.428558 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.428581 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.428613 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.428635 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:07Z","lastTransitionTime":"2025-12-05T23:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.532153 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.532218 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.532231 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.532256 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.532270 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:07Z","lastTransitionTime":"2025-12-05T23:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.635967 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.636041 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.636056 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.636080 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.636099 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:07Z","lastTransitionTime":"2025-12-05T23:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.739027 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.739099 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.739114 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.739138 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.739154 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:07Z","lastTransitionTime":"2025-12-05T23:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.843221 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.843294 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.843316 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.843347 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.843367 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:07Z","lastTransitionTime":"2025-12-05T23:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.849836 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" event={"ID":"7da080e9-7084-4e77-9e1a-051dc8b97f25","Type":"ContainerStarted","Data":"81ed109ca95328fcc458e818da95462a941b14b4a4ad494d73190e64ec494c2b"} Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.872709 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" event={"ID":"2927a376-2f69-4820-a222-b86f08ece55a","Type":"ContainerStarted","Data":"8b2ffde2a6354a726878c82fab03640d219e889aa358efdd008839c042bf9357"} Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.873152 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.877992 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:4
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:07Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.896062 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:07Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.909914 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.911519 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b6c6b8505646feac77ac9d5fa758360c9f9a9f721ee74b52f449ec8ed30dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:07Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.930618 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:07Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.947812 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.947875 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.947896 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.947923 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.947944 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:07Z","lastTransitionTime":"2025-12-05T23:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.963878 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2927a376-2f69-4820-a222-b86f08ece55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8bfg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:07Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:07 crc kubenswrapper[4734]: I1205 23:20:07.989902 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ed109ca95328fcc458e818da95462a941b14b4a4ad494d73190e64ec494c2b\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\
\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bon
d-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\
":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:07Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.009554 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:08Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.026166 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:08Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.045031 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c14bdf9de3cac15f0fff38f916e8da01527893739df49f94b97d7aebc76875a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T23:20:08Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.051563 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.051595 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.051607 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.051625 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.051638 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:08Z","lastTransitionTime":"2025-12-05T23:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.064560 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b8065541883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:08Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.078886 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2937461b56d6a54bf46d04d1246ef99a00bcc8072b52ccc25001376a3b640fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:08Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.090604 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f761cb9e068ee2d46de1b4604f8403e36d7d0d7b8133f0fcb0da1f312f1ef704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:08Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.101893 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\
\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:08Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.114178 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:08Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.130878 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ed109ca95328fcc458e818da95462a941b14b4a4ad494d73190e64ec494c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e895b
ccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:05Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:08Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.147118 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:08Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.154446 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.154897 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.155224 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.155493 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.155724 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:08Z","lastTransitionTime":"2025-12-05T23:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.161968 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:08Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.175261 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c14bdf9de3cac15f0fff38f916e8da01527893739df49f94b97d7aebc76875a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T23:20:08Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.190697 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b80655
41883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:08Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.202742 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2937461b56d6a54bf46d04d1246ef99a00bcc8072b52ccc25001376a3b640fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:08Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.221378 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f761cb9e068ee2d46de1b4604f8403e36d7d0d7b8133f0fcb0da1f312f1ef704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:08Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.238941 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\
\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:08Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.254812 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:08Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.259057 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.259107 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.259121 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.259142 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.259159 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:08Z","lastTransitionTime":"2025-12-05T23:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.270613 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:4
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:08Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.282617 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:08Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.294961 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b6c6b8505646feac77ac9d5fa758360c9f9a9f721ee74b52f449ec8ed30dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:08Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.308791 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:08Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.329948 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2927a376-2f69-4820-a222-b86f08ece55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2ffde2a6354a726878c82fab03640d219e889aa358efdd008839c042bf9357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8bfg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:08Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.361786 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.361830 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.361842 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.361861 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.361872 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:08Z","lastTransitionTime":"2025-12-05T23:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.465089 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.465403 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.465584 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.465733 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.465911 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:08Z","lastTransitionTime":"2025-12-05T23:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.569515 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.569600 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.569661 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.569703 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.569726 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:08Z","lastTransitionTime":"2025-12-05T23:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.613232 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.613268 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.613352 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:08 crc kubenswrapper[4734]: E1205 23:20:08.613436 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:20:08 crc kubenswrapper[4734]: E1205 23:20:08.613559 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:20:08 crc kubenswrapper[4734]: E1205 23:20:08.613708 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.672991 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.673053 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.673063 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.673084 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.673096 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:08Z","lastTransitionTime":"2025-12-05T23:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.776635 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.776702 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.776715 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.776739 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.776754 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:08Z","lastTransitionTime":"2025-12-05T23:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.876797 4734 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.877464 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.879612 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.879679 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.879699 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.879725 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.879744 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:08Z","lastTransitionTime":"2025-12-05T23:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.908704 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.930379 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc2
76e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2eff
fba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:08Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.950621 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:08Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.966893 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b6c6b8505646feac77ac9d5fa758360c9f9a9f721ee74b52f449ec8ed30dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:08Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.983060 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.983114 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.983132 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.983189 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.983202 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:08Z","lastTransitionTime":"2025-12-05T23:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:08 crc kubenswrapper[4734]: I1205 23:20:08.983215 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:08Z 
is after 2025-08-24T17:21:41Z" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.009488 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2927a376-2f69-4820-a222-b86f08ece55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2ffde2a6354a726878c82fab03640d219e889aa358efdd008839c042bf9357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8bfg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:09Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.025721 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-0
5T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:09Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.043169 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:09Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.060262 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c14bdf9de3cac15f0fff38f916e8da01527893739df49f94b97d7aebc76875a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T23:20:09Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.075847 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b80655
41883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:09Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.086948 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.087031 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.087061 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.087094 4734 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.087119 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:09Z","lastTransitionTime":"2025-12-05T23:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.090288 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2937461b56d6a54bf46d04d1246ef99a00bcc8072b52ccc25001376a3b640fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:09Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.109264 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f761cb9e068ee2d46de1b4604f8403e36d7d0d7b8133f0fcb0da1f312f1ef704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0098a95c28de2d528d5dacf74969042d17d545
bc6ee66496c46da61324ec18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:09Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.133601 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ed109ca95328fcc458e818da95462a941b14b4a4ad494d73190e64ec494c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e895b
ccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:05Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:09Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.151761 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:09Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.170121 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:09Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.190497 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.190593 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.190612 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.190637 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.190652 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:09Z","lastTransitionTime":"2025-12-05T23:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.293837 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.293903 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.293919 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.293941 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.293957 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:09Z","lastTransitionTime":"2025-12-05T23:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.397228 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.397315 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.397331 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.397358 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.397369 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:09Z","lastTransitionTime":"2025-12-05T23:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.501402 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.501446 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.501457 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.501475 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.501486 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:09Z","lastTransitionTime":"2025-12-05T23:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.604899 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.604962 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.604976 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.604998 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.605014 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:09Z","lastTransitionTime":"2025-12-05T23:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.632745 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b6c6b8505646feac77ac9d5fa758360c9f9a9f721ee74b52f449ec8ed30dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:09Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.648199 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad9f093a04efdb5a9b3990df19604418f4d9213b08f
680235a67891a0207c1a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:09Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.667376 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2927a376-2f69-4820-a222-b86f08ece55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2ffde2a6354a726878c82fab03640d219e889aa358efdd008839c042bf9357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77
3257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8bfg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:09Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.681239 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25971
26bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:09Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.705593 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:09Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.710596 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 
23:20:09.710636 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.710648 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.710668 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.710681 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:09Z","lastTransitionTime":"2025-12-05T23:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.722630 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:09Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.753484 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c14bdf9de3cac15f0fff38f916e8da01527893739df49f94b97d7aebc76875a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T23:20:09Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.782393 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b80655
41883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:09Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.799698 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2937461b56d6a54bf46d04d1246ef99a00bcc8072b52ccc25001376a3b640fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:09Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.813712 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f761cb9e068ee2d46de1b4604f8403e36d7d0d7b8133f0fcb0da1f312f1ef704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:09Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.815818 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.815847 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.815859 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.815875 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.815890 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:09Z","lastTransitionTime":"2025-12-05T23:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.836237 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ed109ca95328fcc458e818da95462a941b14b4a4ad494d73190e64ec494c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:09Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.858022 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-
05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:09Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.871721 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:09Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.881684 4734 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.886554 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:09Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.918670 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.918723 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.918741 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:09 crc 
kubenswrapper[4734]: I1205 23:20:09.918768 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:09 crc kubenswrapper[4734]: I1205 23:20:09.918783 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:09Z","lastTransitionTime":"2025-12-05T23:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.022834 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.022882 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.022894 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.022915 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.022927 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:10Z","lastTransitionTime":"2025-12-05T23:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.126089 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.126150 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.126162 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.126187 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.126200 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:10Z","lastTransitionTime":"2025-12-05T23:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.229157 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.229221 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.229235 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.229256 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.229270 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:10Z","lastTransitionTime":"2025-12-05T23:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.331787 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.331838 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.331849 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.331868 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.331883 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:10Z","lastTransitionTime":"2025-12-05T23:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.434389 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.434429 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.434459 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.434473 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.434485 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:10Z","lastTransitionTime":"2025-12-05T23:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.538051 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.538089 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.538098 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.538116 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.538127 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:10Z","lastTransitionTime":"2025-12-05T23:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.613619 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.613736 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.613798 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:10 crc kubenswrapper[4734]: E1205 23:20:10.614009 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:20:10 crc kubenswrapper[4734]: E1205 23:20:10.614641 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:20:10 crc kubenswrapper[4734]: E1205 23:20:10.614714 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.641453 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.641514 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.641554 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.641579 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.641592 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:10Z","lastTransitionTime":"2025-12-05T23:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.744184 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.744224 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.744237 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.744260 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.744274 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:10Z","lastTransitionTime":"2025-12-05T23:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.847377 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.847453 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.847477 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.847505 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.847560 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:10Z","lastTransitionTime":"2025-12-05T23:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.885741 4734 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.951112 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.951191 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.951210 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.951239 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:10 crc kubenswrapper[4734]: I1205 23:20:10.951261 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:10Z","lastTransitionTime":"2025-12-05T23:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.055462 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.055578 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.055598 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.055626 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.055647 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:11Z","lastTransitionTime":"2025-12-05T23:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.159174 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.159225 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.159247 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.159270 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.159284 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:11Z","lastTransitionTime":"2025-12-05T23:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.262673 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.262749 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.262787 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.262830 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.262856 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:11Z","lastTransitionTime":"2025-12-05T23:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.366800 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.366858 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.366876 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.366903 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.366922 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:11Z","lastTransitionTime":"2025-12-05T23:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.470585 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.470649 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.470663 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.470683 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.470701 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:11Z","lastTransitionTime":"2025-12-05T23:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.573755 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.573825 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.573846 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.573873 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.573891 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:11Z","lastTransitionTime":"2025-12-05T23:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.676765 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.676824 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.676843 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.676869 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.676886 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:11Z","lastTransitionTime":"2025-12-05T23:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.779726 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.779813 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.779835 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.779860 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.779880 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:11Z","lastTransitionTime":"2025-12-05T23:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.883628 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.883695 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.883711 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.883736 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.883754 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:11Z","lastTransitionTime":"2025-12-05T23:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.891229 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8bfg7_2927a376-2f69-4820-a222-b86f08ece55a/ovnkube-controller/0.log" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.894858 4734 generic.go:334] "Generic (PLEG): container finished" podID="2927a376-2f69-4820-a222-b86f08ece55a" containerID="8b2ffde2a6354a726878c82fab03640d219e889aa358efdd008839c042bf9357" exitCode=1 Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.894938 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" event={"ID":"2927a376-2f69-4820-a222-b86f08ece55a","Type":"ContainerDied","Data":"8b2ffde2a6354a726878c82fab03640d219e889aa358efdd008839c042bf9357"} Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.896015 4734 scope.go:117] "RemoveContainer" containerID="8b2ffde2a6354a726878c82fab03640d219e889aa358efdd008839c042bf9357" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.922704 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:4
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:11Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.945700 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:11Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.963952 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b6c6b8505646feac77ac9d5fa758360c9f9a9f721ee74b52f449ec8ed30dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:11Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.983919 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:11Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.988795 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.988846 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.988860 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.988883 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:11 crc kubenswrapper[4734]: I1205 23:20:11.988957 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:11Z","lastTransitionTime":"2025-12-05T23:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.011762 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2927a376-2f69-4820-a222-b86f08ece55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b2ffde2a6354a726878c82fab03640d219e889aa358efdd008839c042bf9357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2ffde2a6354a726878c82fab03640d219e889aa358efdd008839c042bf9357\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T23:20:11Z\\\",\\\"message\\\":\\\"94 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 23:20:10.554871 5994 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1205 23:20:10.555074 5994 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1205 23:20:10.555409 5994 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 23:20:10.555776 5994 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1205 23:20:10.556112 5994 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 23:20:10.556125 5994 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 23:20:10.556153 5994 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 23:20:10.556162 5994 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 23:20:10.556180 5994 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 23:20:10.556201 5994 factory.go:656] Stopping watch factory\\\\nI1205 23:20:10.556217 5994 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8bfg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:12Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.027721 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2937461b56d6a54bf46d04d1246ef99a00bcc8072b52ccc25001376a3b640fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:12Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.042852 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f761cb9e068ee2d46de1b4604f8403e36d7d0d7b8133f0fcb0da1f312f1ef704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:12Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.066471 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ed109ca95328fcc458e818da95462a941b14b4a4ad494d73190e64ec494c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:12Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.080871 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:12Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.092157 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.092187 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.092202 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.092223 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.092238 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:12Z","lastTransitionTime":"2025-12-05T23:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.097416 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:12Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.111598 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c14bdf9de3cac15f0fff38f916e8da01527893739df49f94b97d7aebc76875a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T23:20:12Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.125739 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b80655
41883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:12Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.140947 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:12Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.157758 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:12Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.198653 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.198785 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.198828 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:12 crc 
kubenswrapper[4734]: I1205 23:20:12.198856 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.198896 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:12Z","lastTransitionTime":"2025-12-05T23:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.302705 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.302769 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.302787 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.302814 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.302832 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:12Z","lastTransitionTime":"2025-12-05T23:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.405090 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.405136 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.405149 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.405167 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.405178 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:12Z","lastTransitionTime":"2025-12-05T23:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.507816 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.507860 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.507870 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.507888 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.507898 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:12Z","lastTransitionTime":"2025-12-05T23:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.611255 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.611327 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.611348 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.611378 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.611397 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:12Z","lastTransitionTime":"2025-12-05T23:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.613566 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:12 crc kubenswrapper[4734]: E1205 23:20:12.613755 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.613804 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.613839 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:12 crc kubenswrapper[4734]: E1205 23:20:12.613949 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:20:12 crc kubenswrapper[4734]: E1205 23:20:12.614047 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.714428 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.714483 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.714497 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.714515 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.714550 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:12Z","lastTransitionTime":"2025-12-05T23:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.818060 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.818100 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.818115 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.818134 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.818149 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:12Z","lastTransitionTime":"2025-12-05T23:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.877393 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.877438 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.877453 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.877474 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.877487 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:12Z","lastTransitionTime":"2025-12-05T23:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:12 crc kubenswrapper[4734]: E1205 23:20:12.891573 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:12Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.896239 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.896302 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.896323 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.896354 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.896373 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:12Z","lastTransitionTime":"2025-12-05T23:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.899302 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8bfg7_2927a376-2f69-4820-a222-b86f08ece55a/ovnkube-controller/0.log" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.902957 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" event={"ID":"2927a376-2f69-4820-a222-b86f08ece55a","Type":"ContainerStarted","Data":"06d97ee97fa08051bb6f3bb012336e973a802ad22ab9e5370bc01cee6db062ba"} Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.903140 4734 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 23:20:12 crc kubenswrapper[4734]: E1205 23:20:12.915486 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:12Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:12Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.920408 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.920458 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.920469 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.920491 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.920502 4734 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:12Z","lastTransitionTime":"2025-12-05T23:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:12 crc kubenswrapper[4734]: I1205 23:20:12.922427 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:12Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.015429 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:4
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:12Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:13 crc kubenswrapper[4734]: E1205 23:20:13.015517 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:12Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.021320 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.021392 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.021427 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.021456 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.021481 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:13Z","lastTransitionTime":"2025-12-05T23:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.034003 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:13Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:13 crc kubenswrapper[4734]: E1205 23:20:13.036453 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:13Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.044990 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.045058 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.045073 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.045097 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.045111 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:13Z","lastTransitionTime":"2025-12-05T23:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.050932 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b6c6b8505646feac77ac9d5fa758360c9f9a9f721ee74b52f449ec8ed30dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:13Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:13 crc kubenswrapper[4734]: E1205 23:20:13.059853 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:13Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:13 crc kubenswrapper[4734]: E1205 23:20:13.060246 4734 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.062581 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.062656 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.062669 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.062698 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.062711 4734 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:13Z","lastTransitionTime":"2025-12-05T23:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.065515 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":
\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:13Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.088310 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2927a376-2f69-4820-a222-b86f08ece55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06d97ee97fa08051bb6f3bb012336e973a802ad22ab9e5370bc01cee6db062ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2ffde2a6354a726878c82fab03640d219e889aa358efdd008839c042bf9357\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T23:20:11Z\\\",\\\"message\\\":\\\"94 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 23:20:10.554871 5994 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 23:20:10.555074 5994 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1205 23:20:10.555409 5994 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 23:20:10.555776 5994 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1205 23:20:10.556112 5994 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 23:20:10.556125 5994 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 23:20:10.556153 5994 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 23:20:10.556162 5994 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 23:20:10.556180 5994 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 23:20:10.556201 5994 factory.go:656] Stopping watch factory\\\\nI1205 23:20:10.556217 5994 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8bfg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:13Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.105025 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:13Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.120848 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:13Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.137830 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c14bdf9de3cac15f0fff38f916e8da01527893739df49f94b97d7aebc76875a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T23:20:13Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.154945 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b80655
41883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:13Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.165945 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.166015 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.166041 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.166072 4734 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.166096 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:13Z","lastTransitionTime":"2025-12-05T23:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.169895 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2937461b56d6a54bf46d04d1246ef99a00bcc8072b52ccc25001376a3b640fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:13Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.187975 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f761cb9e068ee2d46de1b4604f8403e36d7d0d7b8133f0fcb0da1f312f1ef704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0098a95c28de2d528d5dacf74969042d17d545
bc6ee66496c46da61324ec18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:13Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.214239 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ed109ca95328fcc458e818da95462a941b14b4a4ad494d73190e64ec494c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e895b
ccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:05Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:13Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.240749 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:13Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.269149 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.269193 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.269206 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.269223 
4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.269240 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:13Z","lastTransitionTime":"2025-12-05T23:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.372555 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.372601 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.372612 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.372627 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.372639 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:13Z","lastTransitionTime":"2025-12-05T23:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.475877 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.475931 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.475944 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.475962 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.476018 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:13Z","lastTransitionTime":"2025-12-05T23:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.579569 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.579628 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.579641 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.579664 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.579679 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:13Z","lastTransitionTime":"2025-12-05T23:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.683567 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.683621 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.683638 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.683664 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.683683 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:13Z","lastTransitionTime":"2025-12-05T23:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.787253 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.787314 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.787333 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.787357 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.787375 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:13Z","lastTransitionTime":"2025-12-05T23:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.915620 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.915690 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.915713 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.915743 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.915796 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:13Z","lastTransitionTime":"2025-12-05T23:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.925772 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.925900 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:13 crc kubenswrapper[4734]: E1205 23:20:13.926003 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:20:29.92597228 +0000 UTC m=+50.609376586 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:20:13 crc kubenswrapper[4734]: E1205 23:20:13.926099 4734 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 23:20:13 crc kubenswrapper[4734]: E1205 23:20:13.926170 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 23:20:29.926152134 +0000 UTC m=+50.609556440 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.926169 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:13 crc kubenswrapper[4734]: E1205 23:20:13.926287 4734 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 23:20:13 crc kubenswrapper[4734]: E1205 23:20:13.926369 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 23:20:29.926350489 +0000 UTC m=+50.609754805 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.966970 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s"] Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.967819 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.971230 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.971245 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 05 23:20:13 crc kubenswrapper[4734]: I1205 23:20:13.990191 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:13Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.003904 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a85cf646-baec-45c1-a31e-97ce9e087c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wdk8s\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:14Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.018896 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.018951 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.018970 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.018998 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.019019 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:14Z","lastTransitionTime":"2025-12-05T23:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.022708 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:14Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.026966 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.027084 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:14 crc kubenswrapper[4734]: E1205 23:20:14.027180 4734 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 23:20:14 crc kubenswrapper[4734]: E1205 23:20:14.027214 4734 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 
05 23:20:14 crc kubenswrapper[4734]: E1205 23:20:14.027231 4734 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 23:20:14 crc kubenswrapper[4734]: E1205 23:20:14.027242 4734 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 23:20:14 crc kubenswrapper[4734]: E1205 23:20:14.027264 4734 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 23:20:14 crc kubenswrapper[4734]: E1205 23:20:14.027282 4734 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 23:20:14 crc kubenswrapper[4734]: E1205 23:20:14.027299 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 23:20:30.0272775 +0000 UTC m=+50.710681776 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 23:20:14 crc kubenswrapper[4734]: E1205 23:20:14.027375 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 23:20:30.027354221 +0000 UTC m=+50.710758537 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.040267 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:14Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.052930 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b6c6b8505646feac77ac9d5fa758360c9f9a9f721ee74b52f449ec8ed30dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:14Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.071231 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:14Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.097289 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2927a376-2f69-4820-a222-b86f08ece55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06d97ee97fa08051bb6f3bb012336e973a802ad22ab9e5370bc01cee6db062ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2ffde2a6354a726878c82fab03640d219e889aa358efdd008839c042bf9357\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T23:20:11Z\\\",\\\"message\\\":\\\"94 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 23:20:10.554871 5994 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 23:20:10.555074 5994 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1205 23:20:10.555409 5994 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 23:20:10.555776 5994 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1205 23:20:10.556112 5994 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 23:20:10.556125 5994 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 23:20:10.556153 5994 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 23:20:10.556162 5994 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 23:20:10.556180 5994 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 23:20:10.556201 5994 factory.go:656] Stopping watch factory\\\\nI1205 23:20:10.556217 5994 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8bfg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:14Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.122777 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.122877 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.122899 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.122929 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.122947 4734 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:14Z","lastTransitionTime":"2025-12-05T23:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.125298 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ed109ca95328fcc458e818da95462a941b14b4a4ad494d73190e64ec494c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:03Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:14Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.127869 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a85cf646-baec-45c1-a31e-97ce9e087c69-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wdk8s\" (UID: \"a85cf646-baec-45c1-a31e-97ce9e087c69\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.127992 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a85cf646-baec-45c1-a31e-97ce9e087c69-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wdk8s\" (UID: \"a85cf646-baec-45c1-a31e-97ce9e087c69\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.128053 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqrvb\" (UniqueName: 
\"kubernetes.io/projected/a85cf646-baec-45c1-a31e-97ce9e087c69-kube-api-access-tqrvb\") pod \"ovnkube-control-plane-749d76644c-wdk8s\" (UID: \"a85cf646-baec-45c1-a31e-97ce9e087c69\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.128127 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a85cf646-baec-45c1-a31e-97ce9e087c69-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wdk8s\" (UID: \"a85cf646-baec-45c1-a31e-97ce9e087c69\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.141588 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95
b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:14Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.159963 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:14Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.174137 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c14bdf9de3cac15f0fff38f916e8da01527893739df49f94b97d7aebc76875a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T23:20:14Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.192400 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b80655
41883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:14Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.206927 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2937461b56d6a54bf46d04d1246ef99a00bcc8072b52ccc25001376a3b640fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:14Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.221014 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f761cb9e068ee2d46de1b4604f8403e36d7d0d7b8133f0fcb0da1f312f1ef704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:14Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.225456 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.225502 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.225513 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.225549 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.225560 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:14Z","lastTransitionTime":"2025-12-05T23:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.229070 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a85cf646-baec-45c1-a31e-97ce9e087c69-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wdk8s\" (UID: \"a85cf646-baec-45c1-a31e-97ce9e087c69\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.229109 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a85cf646-baec-45c1-a31e-97ce9e087c69-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wdk8s\" (UID: \"a85cf646-baec-45c1-a31e-97ce9e087c69\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.229128 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqrvb\" (UniqueName: \"kubernetes.io/projected/a85cf646-baec-45c1-a31e-97ce9e087c69-kube-api-access-tqrvb\") pod \"ovnkube-control-plane-749d76644c-wdk8s\" (UID: \"a85cf646-baec-45c1-a31e-97ce9e087c69\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.229158 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a85cf646-baec-45c1-a31e-97ce9e087c69-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wdk8s\" (UID: \"a85cf646-baec-45c1-a31e-97ce9e087c69\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.229917 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/a85cf646-baec-45c1-a31e-97ce9e087c69-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wdk8s\" (UID: \"a85cf646-baec-45c1-a31e-97ce9e087c69\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.229923 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a85cf646-baec-45c1-a31e-97ce9e087c69-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wdk8s\" (UID: \"a85cf646-baec-45c1-a31e-97ce9e087c69\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.236123 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d
85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"
startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:14Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.241297 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/a85cf646-baec-45c1-a31e-97ce9e087c69-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wdk8s\" (UID: \"a85cf646-baec-45c1-a31e-97ce9e087c69\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.260002 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqrvb\" (UniqueName: \"kubernetes.io/projected/a85cf646-baec-45c1-a31e-97ce9e087c69-kube-api-access-tqrvb\") pod \"ovnkube-control-plane-749d76644c-wdk8s\" (UID: \"a85cf646-baec-45c1-a31e-97ce9e087c69\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.287755 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" Dec 05 23:20:14 crc kubenswrapper[4734]: W1205 23:20:14.309650 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda85cf646_baec_45c1_a31e_97ce9e087c69.slice/crio-695c15c6f158741fd3dca94f37f2fd496a99c7305600e23a0c9bebe828436e4f WatchSource:0}: Error finding container 695c15c6f158741fd3dca94f37f2fd496a99c7305600e23a0c9bebe828436e4f: Status 404 returned error can't find the container with id 695c15c6f158741fd3dca94f37f2fd496a99c7305600e23a0c9bebe828436e4f Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.328418 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.328478 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.328492 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:14 crc kubenswrapper[4734]: 
I1205 23:20:14.328545 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.328568 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:14Z","lastTransitionTime":"2025-12-05T23:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.433270 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.433340 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.433360 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.433389 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.433412 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:14Z","lastTransitionTime":"2025-12-05T23:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.536740 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.536838 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.536858 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.536889 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.536911 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:14Z","lastTransitionTime":"2025-12-05T23:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.613368 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.613414 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.613500 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:14 crc kubenswrapper[4734]: E1205 23:20:14.613624 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:20:14 crc kubenswrapper[4734]: E1205 23:20:14.613700 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:20:14 crc kubenswrapper[4734]: E1205 23:20:14.613876 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.640188 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.640247 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.640270 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.640296 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.640316 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:14Z","lastTransitionTime":"2025-12-05T23:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.729451 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-l6r6g"] Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.730143 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:14 crc kubenswrapper[4734]: E1205 23:20:14.730214 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.743359 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.743410 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.743425 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.743447 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.743461 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:14Z","lastTransitionTime":"2025-12-05T23:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.746238 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:14Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.763422 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a85cf646-baec-45c1-a31e-97ce9e087c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wdk8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:14Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.781695 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:14Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.797491 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b6c6b8505646feac77ac9d5fa758360c9f9a9f721ee74b52f449ec8ed30dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:14Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.819015 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:14Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.837804 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/641af4fe-dd54-4118-8985-d37a03d64f79-metrics-certs\") pod \"network-metrics-daemon-l6r6g\" (UID: \"641af4fe-dd54-4118-8985-d37a03d64f79\") " pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.837869 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcvhz\" (UniqueName: \"kubernetes.io/projected/641af4fe-dd54-4118-8985-d37a03d64f79-kube-api-access-dcvhz\") pod \"network-metrics-daemon-l6r6g\" (UID: \"641af4fe-dd54-4118-8985-d37a03d64f79\") " pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.846425 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.846465 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.846481 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.846500 4734 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.846514 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:14Z","lastTransitionTime":"2025-12-05T23:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.848190 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2927a376-2f69-4820-a222-b86f08ece55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06d97ee97fa08051bb6f3bb012336e973a802ad22ab9e5370bc01cee6db062ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2ffde2a6354a726878c82fab03640d219e889aa358efdd008839c042bf9357\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T23:20:11Z\\\",\\\"message\\\":\\\"94 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 23:20:10.554871 5994 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 23:20:10.555074 5994 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1205 23:20:10.555409 5994 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 23:20:10.555776 5994 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1205 23:20:10.556112 5994 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 23:20:10.556125 5994 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 23:20:10.556153 5994 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 23:20:10.556162 5994 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 23:20:10.556180 5994 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 23:20:10.556201 5994 factory.go:656] Stopping watch factory\\\\nI1205 23:20:10.556217 5994 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8bfg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:14Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.862404 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l6r6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"641af4fe-dd54-4118-8985-d37a03d64f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcvhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcvhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l6r6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:14Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:14 crc 
kubenswrapper[4734]: I1205 23:20:14.879699 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62f
afdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/k
ube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:14Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.898869 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:14Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.918632 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:14Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.924738 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8bfg7_2927a376-2f69-4820-a222-b86f08ece55a/ovnkube-controller/1.log" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.925590 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8bfg7_2927a376-2f69-4820-a222-b86f08ece55a/ovnkube-controller/0.log" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.931555 4734 generic.go:334] "Generic (PLEG): 
container finished" podID="2927a376-2f69-4820-a222-b86f08ece55a" containerID="06d97ee97fa08051bb6f3bb012336e973a802ad22ab9e5370bc01cee6db062ba" exitCode=1 Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.931669 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" event={"ID":"2927a376-2f69-4820-a222-b86f08ece55a","Type":"ContainerDied","Data":"06d97ee97fa08051bb6f3bb012336e973a802ad22ab9e5370bc01cee6db062ba"} Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.931756 4734 scope.go:117] "RemoveContainer" containerID="8b2ffde2a6354a726878c82fab03640d219e889aa358efdd008839c042bf9357" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.932822 4734 scope.go:117] "RemoveContainer" containerID="06d97ee97fa08051bb6f3bb012336e973a802ad22ab9e5370bc01cee6db062ba" Dec 05 23:20:14 crc kubenswrapper[4734]: E1205 23:20:14.933033 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-8bfg7_openshift-ovn-kubernetes(2927a376-2f69-4820-a222-b86f08ece55a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" podUID="2927a376-2f69-4820-a222-b86f08ece55a" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.936575 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" event={"ID":"a85cf646-baec-45c1-a31e-97ce9e087c69","Type":"ContainerStarted","Data":"93325400e317291da2931220b981cce963abd9cf3cb36d1959f19d136c0d2134"} Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.937101 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" event={"ID":"a85cf646-baec-45c1-a31e-97ce9e087c69","Type":"ContainerStarted","Data":"44a4d2f938eb5aab362754086f82c0bb45b25e167e76d2dbe7192c92982ea9fc"} Dec 05 23:20:14 crc 
kubenswrapper[4734]: I1205 23:20:14.937114 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" event={"ID":"a85cf646-baec-45c1-a31e-97ce9e087c69","Type":"ContainerStarted","Data":"695c15c6f158741fd3dca94f37f2fd496a99c7305600e23a0c9bebe828436e4f"} Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.937454 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c14bdf9de3cac15f0fff38f916e8da01527893739df49f94b97d7aebc76875a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:14Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.938354 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcvhz\" (UniqueName: \"kubernetes.io/projected/641af4fe-dd54-4118-8985-d37a03d64f79-kube-api-access-dcvhz\") pod \"network-metrics-daemon-l6r6g\" (UID: \"641af4fe-dd54-4118-8985-d37a03d64f79\") " pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.938424 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/641af4fe-dd54-4118-8985-d37a03d64f79-metrics-certs\") pod \"network-metrics-daemon-l6r6g\" (UID: \"641af4fe-dd54-4118-8985-d37a03d64f79\") " pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:14 crc kubenswrapper[4734]: E1205 23:20:14.938680 4734 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 23:20:14 crc kubenswrapper[4734]: E1205 23:20:14.938859 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/641af4fe-dd54-4118-8985-d37a03d64f79-metrics-certs podName:641af4fe-dd54-4118-8985-d37a03d64f79 nodeName:}" failed. 
No retries permitted until 2025-12-05 23:20:15.438800783 +0000 UTC m=+36.122205059 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/641af4fe-dd54-4118-8985-d37a03d64f79-metrics-certs") pod "network-metrics-daemon-l6r6g" (UID: "641af4fe-dd54-4118-8985-d37a03d64f79") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.951911 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.951978 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.951991 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.952011 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.952027 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:14Z","lastTransitionTime":"2025-12-05T23:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.959789 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b8065541883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:14Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.964911 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcvhz\" (UniqueName: \"kubernetes.io/projected/641af4fe-dd54-4118-8985-d37a03d64f79-kube-api-access-dcvhz\") pod \"network-metrics-daemon-l6r6g\" (UID: \"641af4fe-dd54-4118-8985-d37a03d64f79\") " pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.978332 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2937461b56d6a54bf46d04d1246ef99a00bcc8072b52ccc25001376a3b640fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:14Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:14 crc kubenswrapper[4734]: I1205 23:20:14.990964 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f761cb9e068ee2d46de1b4604f8403e36d7d0d7b8133f0fcb0da1f312f1ef704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:14Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.007005 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ed109ca95328fcc458e818da95462a941b14b4a4ad494d73190e64ec494c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:15Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.030030 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:
41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:15Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.051881 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:15Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.054125 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.054157 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.054168 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.054186 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.054198 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:15Z","lastTransitionTime":"2025-12-05T23:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.071923 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:15Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.086587 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c14bdf9de3cac15f0fff38f916e8da01527893739df49f94b97d7aebc76875a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T23:20:15Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.105298 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b80655
41883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:15Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.117578 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2937461b56d6a54bf46d04d1246ef99a00bcc8072b52ccc25001376a3b640fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:15Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.131055 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f761cb9e068ee2d46de1b4604f8403e36d7d0d7b8133f0fcb0da1f312f1ef704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:15Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.152147 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ed109ca95328fcc458e818da95462a941b14b4a4ad494d73190e64ec494c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:15Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.156680 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.156724 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.156738 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.156755 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.156768 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:15Z","lastTransitionTime":"2025-12-05T23:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.172001 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7a
dfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:15Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.190698 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:15Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.202734 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a85cf646-baec-45c1-a31e-97ce9e087c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44a4d2f938eb5aab362754086f82c0bb45b25e167e76d2dbe7192c92982ea9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93325400e317291da2931220b981cce963abd
9cf3cb36d1959f19d136c0d2134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wdk8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:15Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.215247 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:15Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.226934 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b6c6b8505646feac77ac9d5fa758360c9f9a9f721ee74b52f449ec8ed30dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:15Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.241620 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:15Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.259304 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.259359 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.259376 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.259402 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.259420 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:15Z","lastTransitionTime":"2025-12-05T23:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.266549 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2927a376-2f69-4820-a222-b86f08ece55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06d97ee97fa08051bb6f3bb012336e973a802ad22ab9e5370bc01cee6db062ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2ffde2a6354a726878c82fab03640d219e889aa358efdd008839c042bf9357\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T23:20:11Z\\\",\\\"message\\\":\\\"94 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 23:20:10.554871 5994 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 23:20:10.555074 5994 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1205 23:20:10.555409 5994 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 23:20:10.555776 5994 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1205 23:20:10.556112 5994 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 23:20:10.556125 5994 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 23:20:10.556153 5994 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 23:20:10.556162 5994 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 23:20:10.556180 5994 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 23:20:10.556201 5994 factory.go:656] Stopping watch factory\\\\nI1205 23:20:10.556217 5994 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d97ee97fa08051bb6f3bb012336e973a802ad22ab9e5370bc01cee6db062ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\":12.919288 6130 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 23:20:12.919320 6130 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 23:20:12.919356 6130 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1205 23:20:12.919370 6130 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1205 23:20:12.919423 6130 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1205 23:20:12.919747 6130 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 23:20:12.919997 6130 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 23:20:12.920181 6130 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 23:20:12.920227 6130 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 23:20:12.920259 6130 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 23:20:12.920293 6130 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1205 23:20:12.920870 6130 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mount
Path\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8bfg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:15Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.281323 4734 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-l6r6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"641af4fe-dd54-4118-8985-d37a03d64f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcvhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcvhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l6r6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:15Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:15 crc 
kubenswrapper[4734]: I1205 23:20:15.301073 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62f
afdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/k
ube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:15Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.362767 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.362821 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.362834 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.362856 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.362872 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:15Z","lastTransitionTime":"2025-12-05T23:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.444315 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/641af4fe-dd54-4118-8985-d37a03d64f79-metrics-certs\") pod \"network-metrics-daemon-l6r6g\" (UID: \"641af4fe-dd54-4118-8985-d37a03d64f79\") " pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:15 crc kubenswrapper[4734]: E1205 23:20:15.444560 4734 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 23:20:15 crc kubenswrapper[4734]: E1205 23:20:15.444680 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/641af4fe-dd54-4118-8985-d37a03d64f79-metrics-certs podName:641af4fe-dd54-4118-8985-d37a03d64f79 nodeName:}" failed. No retries permitted until 2025-12-05 23:20:16.444647246 +0000 UTC m=+37.128051552 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/641af4fe-dd54-4118-8985-d37a03d64f79-metrics-certs") pod "network-metrics-daemon-l6r6g" (UID: "641af4fe-dd54-4118-8985-d37a03d64f79") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.466225 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.466289 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.466310 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.466336 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.466358 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:15Z","lastTransitionTime":"2025-12-05T23:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.569627 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.569678 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.569693 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.569712 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.569724 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:15Z","lastTransitionTime":"2025-12-05T23:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.672684 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.672725 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.672735 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.672753 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.672763 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:15Z","lastTransitionTime":"2025-12-05T23:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.776075 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.776140 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.776162 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.776193 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.776218 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:15Z","lastTransitionTime":"2025-12-05T23:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.879985 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.880060 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.880085 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.880112 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.880131 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:15Z","lastTransitionTime":"2025-12-05T23:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.945030 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8bfg7_2927a376-2f69-4820-a222-b86f08ece55a/ovnkube-controller/1.log" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.983565 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.983606 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.983622 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.983641 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:15 crc kubenswrapper[4734]: I1205 23:20:15.983655 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:15Z","lastTransitionTime":"2025-12-05T23:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.086757 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.086839 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.086863 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.086893 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.086917 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:16Z","lastTransitionTime":"2025-12-05T23:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.189519 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.189643 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.189666 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.189698 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.189721 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:16Z","lastTransitionTime":"2025-12-05T23:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.293031 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.293086 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.293102 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.293123 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.293138 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:16Z","lastTransitionTime":"2025-12-05T23:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.396471 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.396577 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.396598 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.396630 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.396657 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:16Z","lastTransitionTime":"2025-12-05T23:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.458446 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/641af4fe-dd54-4118-8985-d37a03d64f79-metrics-certs\") pod \"network-metrics-daemon-l6r6g\" (UID: \"641af4fe-dd54-4118-8985-d37a03d64f79\") " pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:16 crc kubenswrapper[4734]: E1205 23:20:16.458715 4734 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 23:20:16 crc kubenswrapper[4734]: E1205 23:20:16.458830 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/641af4fe-dd54-4118-8985-d37a03d64f79-metrics-certs podName:641af4fe-dd54-4118-8985-d37a03d64f79 nodeName:}" failed. No retries permitted until 2025-12-05 23:20:18.458805122 +0000 UTC m=+39.142209428 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/641af4fe-dd54-4118-8985-d37a03d64f79-metrics-certs") pod "network-metrics-daemon-l6r6g" (UID: "641af4fe-dd54-4118-8985-d37a03d64f79") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.500322 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.500410 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.500436 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.500464 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.500482 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:16Z","lastTransitionTime":"2025-12-05T23:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.603836 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.603899 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.603922 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.603951 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.603975 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:16Z","lastTransitionTime":"2025-12-05T23:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.613437 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.613473 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.613434 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.613588 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:16 crc kubenswrapper[4734]: E1205 23:20:16.613715 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:20:16 crc kubenswrapper[4734]: E1205 23:20:16.613825 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:20:16 crc kubenswrapper[4734]: E1205 23:20:16.613916 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:20:16 crc kubenswrapper[4734]: E1205 23:20:16.614002 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.706581 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.706671 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.706692 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.706718 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.706736 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:16Z","lastTransitionTime":"2025-12-05T23:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.810054 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.810116 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.810133 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.810157 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.810174 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:16Z","lastTransitionTime":"2025-12-05T23:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.913623 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.913719 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.913738 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.913766 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:16 crc kubenswrapper[4734]: I1205 23:20:16.913784 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:16Z","lastTransitionTime":"2025-12-05T23:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.018123 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.018255 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.018275 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.018304 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.018324 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:17Z","lastTransitionTime":"2025-12-05T23:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.121726 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.121817 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.121843 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.121878 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.121905 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:17Z","lastTransitionTime":"2025-12-05T23:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.225192 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.225263 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.225276 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.225293 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.225305 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:17Z","lastTransitionTime":"2025-12-05T23:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.328190 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.328245 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.328256 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.328275 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.328287 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:17Z","lastTransitionTime":"2025-12-05T23:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.430883 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.430925 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.430941 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.430963 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.430980 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:17Z","lastTransitionTime":"2025-12-05T23:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.534469 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.534570 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.534582 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.534603 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.534616 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:17Z","lastTransitionTime":"2025-12-05T23:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.638755 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.638809 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.638819 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.638850 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.638870 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:17Z","lastTransitionTime":"2025-12-05T23:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.742224 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.742283 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.742293 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.742313 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.742325 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:17Z","lastTransitionTime":"2025-12-05T23:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.846168 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.846261 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.846304 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.846327 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.846342 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:17Z","lastTransitionTime":"2025-12-05T23:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.950220 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.950301 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.950312 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.950338 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:17 crc kubenswrapper[4734]: I1205 23:20:17.950368 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:17Z","lastTransitionTime":"2025-12-05T23:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.054218 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.054306 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.054325 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.054358 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.054380 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:18Z","lastTransitionTime":"2025-12-05T23:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.157745 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.157828 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.157846 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.157873 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.157895 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:18Z","lastTransitionTime":"2025-12-05T23:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.261962 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.262037 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.262050 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.262073 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.262088 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:18Z","lastTransitionTime":"2025-12-05T23:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.365571 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.365621 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.365636 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.365670 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.365691 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:18Z","lastTransitionTime":"2025-12-05T23:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.469715 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.470281 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.470405 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.470537 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.470646 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:18Z","lastTransitionTime":"2025-12-05T23:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.480292 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/641af4fe-dd54-4118-8985-d37a03d64f79-metrics-certs\") pod \"network-metrics-daemon-l6r6g\" (UID: \"641af4fe-dd54-4118-8985-d37a03d64f79\") " pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:18 crc kubenswrapper[4734]: E1205 23:20:18.480421 4734 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 23:20:18 crc kubenswrapper[4734]: E1205 23:20:18.480471 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/641af4fe-dd54-4118-8985-d37a03d64f79-metrics-certs podName:641af4fe-dd54-4118-8985-d37a03d64f79 nodeName:}" failed. No retries permitted until 2025-12-05 23:20:22.480457293 +0000 UTC m=+43.163861569 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/641af4fe-dd54-4118-8985-d37a03d64f79-metrics-certs") pod "network-metrics-daemon-l6r6g" (UID: "641af4fe-dd54-4118-8985-d37a03d64f79") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.574014 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.574113 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.574135 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.574166 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.574189 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:18Z","lastTransitionTime":"2025-12-05T23:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.613861 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.613918 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.613944 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.614166 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:18 crc kubenswrapper[4734]: E1205 23:20:18.614163 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:20:18 crc kubenswrapper[4734]: E1205 23:20:18.614313 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:20:18 crc kubenswrapper[4734]: E1205 23:20:18.614622 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:20:18 crc kubenswrapper[4734]: E1205 23:20:18.614774 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.678648 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.678767 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.678862 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.678891 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.678951 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:18Z","lastTransitionTime":"2025-12-05T23:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.783139 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.783212 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.783232 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.783263 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.783285 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:18Z","lastTransitionTime":"2025-12-05T23:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.886895 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.887423 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.887701 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.887943 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.888103 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:18Z","lastTransitionTime":"2025-12-05T23:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.991358 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.991432 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.991456 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.991486 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:18 crc kubenswrapper[4734]: I1205 23:20:18.991510 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:18Z","lastTransitionTime":"2025-12-05T23:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.095107 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.095175 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.095185 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.095207 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.095221 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:19Z","lastTransitionTime":"2025-12-05T23:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.199468 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.199682 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.199708 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.199745 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.199768 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:19Z","lastTransitionTime":"2025-12-05T23:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.303176 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.303235 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.303247 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.303270 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.303283 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:19Z","lastTransitionTime":"2025-12-05T23:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.407000 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.407082 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.407096 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.407122 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.407136 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:19Z","lastTransitionTime":"2025-12-05T23:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.510100 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.510187 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.510221 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.510262 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.510292 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:19Z","lastTransitionTime":"2025-12-05T23:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.613103 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.613150 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.613167 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.613194 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.613212 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:19Z","lastTransitionTime":"2025-12-05T23:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.633622 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:19Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.652291 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:19Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.668745 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b6c6b8505646feac77ac9d5fa758360c9f9a9f721ee74b52f449ec8ed30dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:19Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.698003 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:19Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.716219 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.716262 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.716277 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.716303 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.716319 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:19Z","lastTransitionTime":"2025-12-05T23:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.737242 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2927a376-2f69-4820-a222-b86f08ece55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06d97ee97fa08051bb6f3bb012336e973a802ad22ab9e5370bc01cee6db062ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2ffde2a6354a726878c82fab03640d219e889aa358efdd008839c042bf9357\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T23:20:11Z\\\",\\\"message\\\":\\\"94 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 23:20:10.554871 5994 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 23:20:10.555074 5994 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1205 23:20:10.555409 5994 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 23:20:10.555776 5994 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1205 23:20:10.556112 5994 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 23:20:10.556125 5994 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 23:20:10.556153 5994 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 23:20:10.556162 5994 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 23:20:10.556180 5994 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 23:20:10.556201 5994 factory.go:656] Stopping watch factory\\\\nI1205 23:20:10.556217 5994 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d97ee97fa08051bb6f3bb012336e973a802ad22ab9e5370bc01cee6db062ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\":12.919288 6130 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 23:20:12.919320 6130 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 23:20:12.919356 6130 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1205 23:20:12.919370 6130 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1205 23:20:12.919423 6130 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1205 23:20:12.919747 6130 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 23:20:12.919997 6130 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 23:20:12.920181 6130 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 23:20:12.920227 6130 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 23:20:12.920259 6130 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 23:20:12.920293 6130 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1205 23:20:12.920870 6130 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mount
Path\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8bfg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:19Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.758295 4734 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-l6r6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"641af4fe-dd54-4118-8985-d37a03d64f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcvhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcvhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l6r6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:19Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:19 crc 
kubenswrapper[4734]: I1205 23:20:19.778982 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f761cb9e068ee2d46de1b4604f8403e36d7d0d7b8133f0fcb0da1f312f1ef704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:19Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.801562 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ed109ca95328fcc458e818da95462a941b14b4a4ad494d73190e64ec494c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e895b
ccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:05Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:19Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.819285 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.819337 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.819348 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.819369 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.819385 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:19Z","lastTransitionTime":"2025-12-05T23:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.822246 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:19Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.841293 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:19Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.866105 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c14bdf9de3cac15f0fff38f916e8da01527893739df49f94b97d7aebc76875a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T23:20:19Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.884620 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b80655
41883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:19Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.902225 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2937461b56d6a54bf46d04d1246ef99a00bcc8072b52ccc25001376a3b640fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:19Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.923218 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.923476 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.923610 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.923723 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.923848 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:19Z","lastTransitionTime":"2025-12-05T23:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.926110 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7a
dfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:19Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.942308 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:19Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:19 crc kubenswrapper[4734]: I1205 23:20:19.961284 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a85cf646-baec-45c1-a31e-97ce9e087c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44a4d2f938eb5aab362754086f82c0bb45b25e167e76d2dbe7192c92982ea9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93325400e317291da2931220b981cce963abd
9cf3cb36d1959f19d136c0d2134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wdk8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:19Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.027662 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.027727 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.027747 4734 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.027774 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.027809 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:20Z","lastTransitionTime":"2025-12-05T23:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.131557 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.131619 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.131640 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.131672 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.131693 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:20Z","lastTransitionTime":"2025-12-05T23:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.235624 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.235719 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.235747 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.235785 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.235809 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:20Z","lastTransitionTime":"2025-12-05T23:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.338771 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.338828 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.338846 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.338872 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.338889 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:20Z","lastTransitionTime":"2025-12-05T23:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.442610 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.442695 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.442723 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.442754 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.442779 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:20Z","lastTransitionTime":"2025-12-05T23:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.545836 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.545898 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.545915 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.545938 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.545955 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:20Z","lastTransitionTime":"2025-12-05T23:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.613516 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.613597 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.613627 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.613564 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:20 crc kubenswrapper[4734]: E1205 23:20:20.613760 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:20:20 crc kubenswrapper[4734]: E1205 23:20:20.613839 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:20:20 crc kubenswrapper[4734]: E1205 23:20:20.614209 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:20:20 crc kubenswrapper[4734]: E1205 23:20:20.614304 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.649607 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.649657 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.649675 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.649700 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.649721 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:20Z","lastTransitionTime":"2025-12-05T23:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.752474 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.752598 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.752621 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.752650 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.752669 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:20Z","lastTransitionTime":"2025-12-05T23:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.855724 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.855792 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.855814 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.855840 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.855857 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:20Z","lastTransitionTime":"2025-12-05T23:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.959441 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.959518 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.959562 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.959589 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:20 crc kubenswrapper[4734]: I1205 23:20:20.959613 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:20Z","lastTransitionTime":"2025-12-05T23:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.062832 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.062906 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.062925 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.062951 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.062978 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:21Z","lastTransitionTime":"2025-12-05T23:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.167211 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.167268 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.167284 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.167307 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.167326 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:21Z","lastTransitionTime":"2025-12-05T23:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.270665 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.270741 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.270760 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.270790 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.270812 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:21Z","lastTransitionTime":"2025-12-05T23:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.373948 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.374013 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.374026 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.374047 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.374060 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:21Z","lastTransitionTime":"2025-12-05T23:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.477393 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.477474 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.477498 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.477576 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.477606 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:21Z","lastTransitionTime":"2025-12-05T23:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.581205 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.581278 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.581295 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.581321 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.581336 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:21Z","lastTransitionTime":"2025-12-05T23:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.684748 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.684818 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.684840 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.684867 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.684886 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:21Z","lastTransitionTime":"2025-12-05T23:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.787809 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.787893 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.787916 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.787951 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.787975 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:21Z","lastTransitionTime":"2025-12-05T23:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.891420 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.891508 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.891557 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.891582 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.891597 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:21Z","lastTransitionTime":"2025-12-05T23:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.994639 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.994704 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.994719 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.994743 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:21 crc kubenswrapper[4734]: I1205 23:20:21.994756 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:21Z","lastTransitionTime":"2025-12-05T23:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.099061 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.099372 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.099391 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.099439 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.099454 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:22Z","lastTransitionTime":"2025-12-05T23:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.203024 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.203101 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.203119 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.203150 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.203167 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:22Z","lastTransitionTime":"2025-12-05T23:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.306874 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.306961 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.306993 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.307034 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.307061 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:22Z","lastTransitionTime":"2025-12-05T23:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.410247 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.410322 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.410337 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.410358 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.410372 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:22Z","lastTransitionTime":"2025-12-05T23:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.514006 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.514070 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.514088 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.514114 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.514133 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:22Z","lastTransitionTime":"2025-12-05T23:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.530028 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/641af4fe-dd54-4118-8985-d37a03d64f79-metrics-certs\") pod \"network-metrics-daemon-l6r6g\" (UID: \"641af4fe-dd54-4118-8985-d37a03d64f79\") " pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:22 crc kubenswrapper[4734]: E1205 23:20:22.530300 4734 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 23:20:22 crc kubenswrapper[4734]: E1205 23:20:22.530438 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/641af4fe-dd54-4118-8985-d37a03d64f79-metrics-certs podName:641af4fe-dd54-4118-8985-d37a03d64f79 nodeName:}" failed. No retries permitted until 2025-12-05 23:20:30.530408164 +0000 UTC m=+51.213812430 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/641af4fe-dd54-4118-8985-d37a03d64f79-metrics-certs") pod "network-metrics-daemon-l6r6g" (UID: "641af4fe-dd54-4118-8985-d37a03d64f79") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.613507 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.613576 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.613595 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.613662 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:22 crc kubenswrapper[4734]: E1205 23:20:22.613873 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:20:22 crc kubenswrapper[4734]: E1205 23:20:22.614106 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:20:22 crc kubenswrapper[4734]: E1205 23:20:22.614382 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:20:22 crc kubenswrapper[4734]: E1205 23:20:22.614611 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.617032 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.617077 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.617090 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.617110 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.617124 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:22Z","lastTransitionTime":"2025-12-05T23:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.720450 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.720573 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.720597 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.720630 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.720649 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:22Z","lastTransitionTime":"2025-12-05T23:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.824093 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.824159 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.824177 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.824314 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.824337 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:22Z","lastTransitionTime":"2025-12-05T23:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.927337 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.927403 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.927420 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.927448 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:22 crc kubenswrapper[4734]: I1205 23:20:22.927470 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:22Z","lastTransitionTime":"2025-12-05T23:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.030631 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.030712 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.030733 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.030762 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.030786 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:23Z","lastTransitionTime":"2025-12-05T23:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.133778 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.133829 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.133838 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.133856 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.133868 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:23Z","lastTransitionTime":"2025-12-05T23:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.237744 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.237805 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.237815 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.237835 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.237847 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:23Z","lastTransitionTime":"2025-12-05T23:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.337082 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.337134 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.337167 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.337184 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.337196 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:23Z","lastTransitionTime":"2025-12-05T23:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:23 crc kubenswrapper[4734]: E1205 23:20:23.359361 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:23Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.364338 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.364379 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.364394 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.364412 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.364426 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:23Z","lastTransitionTime":"2025-12-05T23:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:23 crc kubenswrapper[4734]: E1205 23:20:23.380587 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:23Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.384760 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.384821 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.384840 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.384863 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.384881 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:23Z","lastTransitionTime":"2025-12-05T23:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:23 crc kubenswrapper[4734]: E1205 23:20:23.403124 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:23Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.408030 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.408079 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.408094 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.408115 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.408133 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:23Z","lastTransitionTime":"2025-12-05T23:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:23 crc kubenswrapper[4734]: E1205 23:20:23.422984 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:23Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.427016 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.427067 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.427079 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.427096 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.427108 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:23Z","lastTransitionTime":"2025-12-05T23:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:23 crc kubenswrapper[4734]: E1205 23:20:23.443746 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:23Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:23 crc kubenswrapper[4734]: E1205 23:20:23.443948 4734 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.445625 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.445663 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.445677 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.445694 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.445707 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:23Z","lastTransitionTime":"2025-12-05T23:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.549445 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.549560 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.549590 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.549621 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.549643 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:23Z","lastTransitionTime":"2025-12-05T23:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.652412 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.652490 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.652516 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.652583 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.652606 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:23Z","lastTransitionTime":"2025-12-05T23:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.755818 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.755882 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.755895 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.755919 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.755934 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:23Z","lastTransitionTime":"2025-12-05T23:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.859461 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.859598 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.859631 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.859663 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.859691 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:23Z","lastTransitionTime":"2025-12-05T23:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.963308 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.963363 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.963374 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.963395 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:23 crc kubenswrapper[4734]: I1205 23:20:23.963407 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:23Z","lastTransitionTime":"2025-12-05T23:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.066675 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.066836 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.066868 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.066895 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.066915 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:24Z","lastTransitionTime":"2025-12-05T23:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.169981 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.170053 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.170074 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.170106 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.170127 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:24Z","lastTransitionTime":"2025-12-05T23:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.273712 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.273871 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.273899 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.273928 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.273947 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:24Z","lastTransitionTime":"2025-12-05T23:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.377516 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.377677 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.377701 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.377727 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.377744 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:24Z","lastTransitionTime":"2025-12-05T23:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.481399 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.481480 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.481506 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.481581 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.481602 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:24Z","lastTransitionTime":"2025-12-05T23:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.585085 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.585159 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.585183 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.585216 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.585239 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:24Z","lastTransitionTime":"2025-12-05T23:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.613358 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:24 crc kubenswrapper[4734]: E1205 23:20:24.613515 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.613837 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.613878 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.613852 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:24 crc kubenswrapper[4734]: E1205 23:20:24.613973 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:20:24 crc kubenswrapper[4734]: E1205 23:20:24.614143 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:20:24 crc kubenswrapper[4734]: E1205 23:20:24.614308 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.688387 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.688470 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.688483 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.688507 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.688536 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:24Z","lastTransitionTime":"2025-12-05T23:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.791565 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.791641 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.791659 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.791690 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.791708 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:24Z","lastTransitionTime":"2025-12-05T23:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.894348 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.894420 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.894440 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.894468 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.894488 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:24Z","lastTransitionTime":"2025-12-05T23:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.997066 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.997126 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.997140 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.997165 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:24 crc kubenswrapper[4734]: I1205 23:20:24.997178 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:24Z","lastTransitionTime":"2025-12-05T23:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.101493 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.101567 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.101579 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.101603 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.101617 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:25Z","lastTransitionTime":"2025-12-05T23:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.204221 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.204284 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.204300 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.204324 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.204340 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:25Z","lastTransitionTime":"2025-12-05T23:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.307410 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.307484 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.307504 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.307573 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.307594 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:25Z","lastTransitionTime":"2025-12-05T23:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.410882 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.410955 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.410973 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.410999 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.411018 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:25Z","lastTransitionTime":"2025-12-05T23:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.514699 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.514759 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.514802 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.514851 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.514871 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:25Z","lastTransitionTime":"2025-12-05T23:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.618491 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.618773 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.619079 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.619219 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.619360 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:25Z","lastTransitionTime":"2025-12-05T23:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.723095 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.723374 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.723506 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.723717 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.723875 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:25Z","lastTransitionTime":"2025-12-05T23:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.827753 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.827829 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.827842 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.827871 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.827884 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:25Z","lastTransitionTime":"2025-12-05T23:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.931967 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.932039 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.932076 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.932107 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:25 crc kubenswrapper[4734]: I1205 23:20:25.932125 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:25Z","lastTransitionTime":"2025-12-05T23:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.035682 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.035779 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.035804 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.035837 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.035860 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:26Z","lastTransitionTime":"2025-12-05T23:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.139483 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.139581 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.139600 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.139628 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.139646 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:26Z","lastTransitionTime":"2025-12-05T23:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.242878 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.242955 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.242969 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.242993 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.243009 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:26Z","lastTransitionTime":"2025-12-05T23:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.346427 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.346473 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.346486 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.346509 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.346548 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:26Z","lastTransitionTime":"2025-12-05T23:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.450011 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.450115 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.450198 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.450228 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.450250 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:26Z","lastTransitionTime":"2025-12-05T23:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.553945 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.554014 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.554036 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.554062 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.554081 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:26Z","lastTransitionTime":"2025-12-05T23:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.613230 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.613285 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.613303 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.613239 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:26 crc kubenswrapper[4734]: E1205 23:20:26.613457 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:20:26 crc kubenswrapper[4734]: E1205 23:20:26.613622 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:20:26 crc kubenswrapper[4734]: E1205 23:20:26.613886 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:20:26 crc kubenswrapper[4734]: E1205 23:20:26.614127 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.657619 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.657680 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.657698 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.657726 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.657745 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:26Z","lastTransitionTime":"2025-12-05T23:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.761378 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.761437 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.761450 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.761470 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.761481 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:26Z","lastTransitionTime":"2025-12-05T23:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.864218 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.864276 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.864296 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.864319 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.864336 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:26Z","lastTransitionTime":"2025-12-05T23:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.968189 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.968263 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.968287 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.968332 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:26 crc kubenswrapper[4734]: I1205 23:20:26.968350 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:26Z","lastTransitionTime":"2025-12-05T23:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.070779 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.070834 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.070851 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.070875 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.070891 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:27Z","lastTransitionTime":"2025-12-05T23:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.180487 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.180591 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.180613 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.180642 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.180662 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:27Z","lastTransitionTime":"2025-12-05T23:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.253945 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.267467 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.277402 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:27Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.284094 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.284169 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.284188 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.284217 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.284237 4734 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:27Z","lastTransitionTime":"2025-12-05T23:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.296958 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b6c6b8505646feac77ac9d5fa758360c9f9a9f721ee74b52f449ec8ed30dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:27Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.315731 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:27Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.340621 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2927a376-2f69-4820-a222-b86f08ece55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06d97ee97fa08051bb6f3bb012336e973a802ad22ab9e5370bc01cee6db062ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b2ffde2a6354a726878c82fab03640d219e889aa358efdd008839c042bf9357\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T23:20:11Z\\\",\\\"message\\\":\\\"94 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 23:20:10.554871 5994 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 23:20:10.555074 5994 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1205 23:20:10.555409 5994 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1205 23:20:10.555776 5994 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1205 23:20:10.556112 5994 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 23:20:10.556125 5994 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 23:20:10.556153 5994 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 23:20:10.556162 5994 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 23:20:10.556180 5994 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 23:20:10.556201 5994 factory.go:656] Stopping watch factory\\\\nI1205 23:20:10.556217 5994 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d97ee97fa08051bb6f3bb012336e973a802ad22ab9e5370bc01cee6db062ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\":12.919288 6130 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 23:20:12.919320 6130 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 23:20:12.919356 6130 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1205 23:20:12.919370 6130 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1205 23:20:12.919423 6130 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1205 23:20:12.919747 6130 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 23:20:12.919997 6130 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 23:20:12.920181 6130 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 23:20:12.920227 6130 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 23:20:12.920259 6130 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 23:20:12.920293 6130 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1205 23:20:12.920870 6130 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mount
Path\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8bfg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:27Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.362634 4734 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-l6r6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"641af4fe-dd54-4118-8985-d37a03d64f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcvhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcvhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l6r6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:27Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:27 crc 
kubenswrapper[4734]: I1205 23:20:27.387776 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.387871 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.387890 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.387920 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.387938 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:27Z","lastTransitionTime":"2025-12-05T23:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.387949 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:27Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.406692 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:27Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.424204 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:27Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.443629 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c14bdf9de3cac15f0fff38f916e8da01527893739df49f94b97d7aebc76875a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T23:20:27Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.461022 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b80655
41883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:27Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.481080 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2937461b56d6a54bf46d04d1246ef99a00bcc8072b52ccc25001376a3b640fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:27Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.492177 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.492238 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.492256 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.492283 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.492300 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:27Z","lastTransitionTime":"2025-12-05T23:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.499907 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f761cb9e068ee2d46de1b4604f8403e36d7d0d7b8133f0fcb0da1f312f1ef704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:27Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.520425 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ed109ca95328fcc458e818da95462a941b14b4a4ad494d73190e64ec494c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e895b
ccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:05Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:27Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.538229 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:27Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.558559 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:27Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.578067 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a85cf646-baec-45c1-a31e-97ce9e087c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44a4d2f938eb5aab362754086f82c0bb45b25e167e76d2dbe7192c92982ea9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93325400e317291da2931220b981cce963abd
9cf3cb36d1959f19d136c0d2134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wdk8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:27Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.595212 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.595272 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.595287 4734 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.595307 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.595320 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:27Z","lastTransitionTime":"2025-12-05T23:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.698317 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.698383 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.698400 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.698432 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.698458 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:27Z","lastTransitionTime":"2025-12-05T23:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.801562 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.801634 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.801653 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.801681 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.801700 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:27Z","lastTransitionTime":"2025-12-05T23:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.904192 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.904247 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.904263 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.904290 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:27 crc kubenswrapper[4734]: I1205 23:20:27.904307 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:27Z","lastTransitionTime":"2025-12-05T23:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.006463 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.006513 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.006539 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.006559 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.006569 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:28Z","lastTransitionTime":"2025-12-05T23:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.109848 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.109916 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.109931 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.109957 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.109976 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:28Z","lastTransitionTime":"2025-12-05T23:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.212283 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.212337 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.212348 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.212367 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.212384 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:28Z","lastTransitionTime":"2025-12-05T23:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.316330 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.316390 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.316403 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.316425 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.316440 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:28Z","lastTransitionTime":"2025-12-05T23:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.419577 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.419643 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.419652 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.419672 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.419683 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:28Z","lastTransitionTime":"2025-12-05T23:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.522418 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.522488 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.522508 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.522570 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.522591 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:28Z","lastTransitionTime":"2025-12-05T23:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.613269 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.613362 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.613379 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.613510 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:28 crc kubenswrapper[4734]: E1205 23:20:28.613642 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:20:28 crc kubenswrapper[4734]: E1205 23:20:28.614207 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:20:28 crc kubenswrapper[4734]: E1205 23:20:28.614343 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:20:28 crc kubenswrapper[4734]: E1205 23:20:28.614468 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.614516 4734 scope.go:117] "RemoveContainer" containerID="06d97ee97fa08051bb6f3bb012336e973a802ad22ab9e5370bc01cee6db062ba" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.624343 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.624394 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.624408 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.624433 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.624450 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:28Z","lastTransitionTime":"2025-12-05T23:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.635099 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:28Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.654279 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a85cf646-baec-45c1-a31e-97ce9e087c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44a4d2f938eb5aab362754086f82c0bb45b25e167e76d2dbe7192c92982ea9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93325400e317291da2931220b981cce963abd
9cf3cb36d1959f19d136c0d2134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wdk8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:28Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.671749 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:4
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:28Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.685516 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:28Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.707689 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b6c6b8505646feac77ac9d5fa758360c9f9a9f721ee74b52f449ec8ed30dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:28Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.728003 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.728058 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.728073 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.728097 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.728112 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:28Z","lastTransitionTime":"2025-12-05T23:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.750701 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:28Z 
is after 2025-08-24T17:21:41Z" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.775051 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2927a376-2f69-4820-a222-b86f08ece55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06d97ee97fa08051bb6f3bb012336e973a802ad22ab9e5370bc01cee6db062ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d97ee97fa08051bb6f3bb012336e973a802ad22ab9e5370bc01cee6db062ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\":12.919288 6130 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 23:20:12.919320 6130 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 23:20:12.919356 6130 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1205 23:20:12.919370 6130 handler.go:190] Sending *v1.Pod event handler 6 for 
removal\\\\nI1205 23:20:12.919423 6130 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1205 23:20:12.919747 6130 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 23:20:12.919997 6130 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 23:20:12.920181 6130 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 23:20:12.920227 6130 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 23:20:12.920259 6130 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 23:20:12.920293 6130 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1205 23:20:12.920870 6130 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8bfg7_openshift-ovn-kubernetes(2927a376-2f69-4820-a222-b86f08ece55a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd
578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8bfg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:28Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.787049 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l6r6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"641af4fe-dd54-4118-8985-d37a03d64f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcvhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcvhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l6r6g\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:28Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.799847 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f761cb9e068ee2d46de1b4604f8403e36d7d0d7b8133f0fcb0da1f312f1ef704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:28Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:28 crc 
kubenswrapper[4734]: I1205 23:20:28.817045 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ed109ca95328fcc458e818da95462a941b14b4a4ad494d73190e64ec494c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71
b2713275377ecc294c6144ef2152be83c87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:28Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.831807 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.831879 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.831899 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.831960 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.831981 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:28Z","lastTransitionTime":"2025-12-05T23:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.835127 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:28Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.855170 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:28Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.869739 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c14bdf9de3cac15f0fff38f916e8da01527893739df49f94b97d7aebc76875a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T23:20:28Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.889597 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b80655
41883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:28Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.904621 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2937461b56d6a54bf46d04d1246ef99a00bcc8072b52ccc25001376a3b640fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:28Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.926698 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:28Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.935840 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.935900 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 
23:20:28.935920 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.935950 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.935969 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:28Z","lastTransitionTime":"2025-12-05T23:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:28 crc kubenswrapper[4734]: I1205 23:20:28.944345 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b089eaa-85b7-420d-914f-b053257be3c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32d72232eb5162100a1a381e51548864fc732ff00fd26239351ec294328fc7fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9676ebc0e731c50baebbab917a9dc814ceea006a370980021eaeb8bf822825b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be62799b986e89e6324a37ffed14cfc15d4fa6efec043e842534075da2b7547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff9c5cbf877fad8c2d4155cab3be27491de84cf4b7f3476f60a02de39936ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ff9c5cbf877fad8c2d4155cab3be27491de84cf4b7f3476f60a02de39936ab51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:28Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.002709 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8bfg7_2927a376-2f69-4820-a222-b86f08ece55a/ovnkube-controller/1.log" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.005912 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" event={"ID":"2927a376-2f69-4820-a222-b86f08ece55a","Type":"ContainerStarted","Data":"ccde57f4cb8d41050120cab8e9d3de18cee5141f9f3ae7bd5abf452b06c74e8c"} Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.006062 4734 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.022667 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.036159 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a85cf646-baec-45c1-a31e-97ce9e087c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44a4d2f938eb5aab362754086f82c0bb45b25e167e76d2dbe7192c92982ea9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93325400e317291da2931220b981cce963abd
9cf3cb36d1959f19d136c0d2134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wdk8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.039014 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.039048 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.039058 4734 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.039077 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.039090 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:29Z","lastTransitionTime":"2025-12-05T23:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.048991 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.062328 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b6c6b8505646feac77ac9d5fa758360c9f9a9f721ee74b52f449ec8ed30dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.074979 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.093982 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2927a376-2f69-4820-a222-b86f08ece55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccde57f4cb8d41050120cab8e9d3de18cee5141f9f3ae7bd5abf452b06c74e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d97ee97fa08051bb6f3bb012336e973a802ad22ab9e5370bc01cee6db062ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\":12.919288 6130 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 23:20:12.919320 6130 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 23:20:12.919356 6130 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1205 23:20:12.919370 6130 handler.go:190] Sending *v1.Pod event handler 6 for 
removal\\\\nI1205 23:20:12.919423 6130 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1205 23:20:12.919747 6130 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 23:20:12.919997 6130 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 23:20:12.920181 6130 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 23:20:12.920227 6130 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 23:20:12.920259 6130 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 23:20:12.920293 6130 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1205 23:20:12.920870 6130 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8bfg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.116281 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l6r6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"641af4fe-dd54-4118-8985-d37a03d64f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcvhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcvhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l6r6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc 
kubenswrapper[4734]: I1205 23:20:29.131320 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62f
afdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/k
ube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.141518 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.141569 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.141581 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.141602 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.141627 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:29Z","lastTransitionTime":"2025-12-05T23:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.150063 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.163880 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.176895 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c14bdf9de3cac15f0fff38f916e8da01527893739df49f94b97d7aebc76875a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.188718 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b80655
41883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.199929 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2937461b56d6a54bf46d04d1246ef99a00bcc8072b52ccc25001376a3b640fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.213771 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f761cb9e068ee2d46de1b4604f8403e36d7d0d7b8133f0fcb0da1f312f1ef704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.232640 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ed109ca95328fcc458e818da95462a941b14b4a4ad494d73190e64ec494c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.244772 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.244820 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.244830 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.244854 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.244866 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:29Z","lastTransitionTime":"2025-12-05T23:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.250972 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b089eaa-85b7-420d-914f-b053257be3c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32d72232eb5162100a1a381e51548864fc732ff00fd26239351ec294328fc7fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9676ebc0e731c50baebbab917a9dc
814ceea006a370980021eaeb8bf822825b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be62799b986e89e6324a37ffed14cfc15d4fa6efec043e842534075da2b7547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff9c5cbf877fad8c2d4155cab3be27491de84cf4b7f3476f60a02de39936ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff9c5cbf877fad8c2d4155cab3be27491de84cf4b7f3476f60a02de39936ab51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.264644 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.348310 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.348371 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.348384 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.348403 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.348419 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:29Z","lastTransitionTime":"2025-12-05T23:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.451795 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.451850 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.451866 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.451888 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.451902 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:29Z","lastTransitionTime":"2025-12-05T23:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.554965 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.555017 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.555037 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.555065 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.555083 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:29Z","lastTransitionTime":"2025-12-05T23:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.631776 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.644104 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a85cf646-baec-45c1-a31e-97ce9e087c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44a4d2f938eb5aab362754086f82c0bb45b25e167e76d2dbe7192c92982ea9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93325400e317291da2931220b981cce963abd
9cf3cb36d1959f19d136c0d2134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wdk8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.657863 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.657914 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.657927 4734 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.657950 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.657963 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:29Z","lastTransitionTime":"2025-12-05T23:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.668639 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2927a376-2f69-4820-a222-b86f08ece55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccde57f4cb8d41050120cab8e9d3de18cee5141f9f3ae7bd5abf452b06c74e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d97ee97fa08051bb6f3bb012336e973a802ad22ab9e5370bc01cee6db062ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\":12.919288 6130 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 23:20:12.919320 6130 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 23:20:12.919356 6130 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1205 23:20:12.919370 6130 handler.go:190] Sending *v1.Pod event handler 6 for 
removal\\\\nI1205 23:20:12.919423 6130 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1205 23:20:12.919747 6130 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 23:20:12.919997 6130 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 23:20:12.920181 6130 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 23:20:12.920227 6130 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 23:20:12.920259 6130 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 23:20:12.920293 6130 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1205 23:20:12.920870 6130 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8bfg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.687087 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l6r6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"641af4fe-dd54-4118-8985-d37a03d64f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcvhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcvhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l6r6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc 
kubenswrapper[4734]: I1205 23:20:29.718585 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62f
afdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/k
ube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.740349 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.753005 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b6c6b8505646feac77ac9d5fa758360c9f9a9f721ee74b52f449ec8ed30dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.761113 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.761148 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.761159 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.761178 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.761192 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:29Z","lastTransitionTime":"2025-12-05T23:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.769016 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:29Z 
is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.782790 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b8065541883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.793730 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2937461b56d6a54bf46d04d1246ef99a00bcc8072b52ccc25001376a3b640fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.805346 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f761cb9e068ee2d46de1b4604f8403e36d7d0d7b8133f0fcb0da1f312f1ef704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.818863 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ed109ca95328fcc458e818da95462a941b14b4a4ad494d73190e64ec494c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.831712 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.846157 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.859203 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c14bdf9de3cac15f0fff38f916e8da01527893739df49f94b97d7aebc76875a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.863762 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.863818 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.863831 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.863852 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.863867 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:29Z","lastTransitionTime":"2025-12-05T23:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.873324 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7a
dfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.885193 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b089eaa-85b7-420d-914f-b053257be3c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32d72232eb5162100a1a381e51548864fc732ff00fd26239351ec294328fc7fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9676ebc0e731c50baebbab917a9dc814ceea006a370980021eaeb8bf822825b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be62799b986e89e6324a37ffed14cfc15d4fa6efec043e842534075da2b7547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff9c5cbf877fad8c2d4155cab3be27491de84cf4b7f3476f60a02de39936ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ff9c5cbf877fad8c2d4155cab3be27491de84cf4b7f3476f60a02de39936ab51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:29Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.966603 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.966643 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.966660 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.966685 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:29 crc kubenswrapper[4734]: I1205 23:20:29.966701 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:29Z","lastTransitionTime":"2025-12-05T23:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.017274 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.017486 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:30 crc kubenswrapper[4734]: E1205 23:20:30.017606 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:21:02.017560368 +0000 UTC m=+82.700964664 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:20:30 crc kubenswrapper[4734]: E1205 23:20:30.017659 4734 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.017696 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:30 crc kubenswrapper[4734]: E1205 23:20:30.017744 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 23:21:02.017718061 +0000 UTC m=+82.701122487 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 23:20:30 crc kubenswrapper[4734]: E1205 23:20:30.017871 4734 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 23:20:30 crc kubenswrapper[4734]: E1205 23:20:30.017920 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 23:21:02.017908387 +0000 UTC m=+82.701312653 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.069867 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.069912 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.069924 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.069944 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 
05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.069960 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:30Z","lastTransitionTime":"2025-12-05T23:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.119100 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.119198 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:30 crc kubenswrapper[4734]: E1205 23:20:30.119418 4734 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 23:20:30 crc kubenswrapper[4734]: E1205 23:20:30.119444 4734 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 23:20:30 crc kubenswrapper[4734]: E1205 23:20:30.119440 4734 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 23:20:30 crc kubenswrapper[4734]: E1205 23:20:30.119502 4734 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 23:20:30 crc kubenswrapper[4734]: E1205 23:20:30.119462 4734 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 23:20:30 crc kubenswrapper[4734]: E1205 23:20:30.119521 4734 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 23:20:30 crc kubenswrapper[4734]: E1205 23:20:30.119650 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 23:21:02.119623366 +0000 UTC m=+82.803027642 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 23:20:30 crc kubenswrapper[4734]: E1205 23:20:30.119670 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 23:21:02.119663147 +0000 UTC m=+82.803067423 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.173189 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.173237 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.173246 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.173270 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.173281 4734 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:30Z","lastTransitionTime":"2025-12-05T23:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.277256 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.277347 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.277374 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.277402 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.277422 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:30Z","lastTransitionTime":"2025-12-05T23:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.381094 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.381167 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.381190 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.381226 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.381251 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:30Z","lastTransitionTime":"2025-12-05T23:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.484416 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.484486 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.484505 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.484568 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.484597 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:30Z","lastTransitionTime":"2025-12-05T23:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.589466 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.589549 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.589562 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.589586 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.589601 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:30Z","lastTransitionTime":"2025-12-05T23:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.613210 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.613258 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.613290 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:30 crc kubenswrapper[4734]: E1205 23:20:30.613395 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.613235 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:30 crc kubenswrapper[4734]: E1205 23:20:30.613518 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:20:30 crc kubenswrapper[4734]: E1205 23:20:30.613776 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:20:30 crc kubenswrapper[4734]: E1205 23:20:30.614038 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.625969 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/641af4fe-dd54-4118-8985-d37a03d64f79-metrics-certs\") pod \"network-metrics-daemon-l6r6g\" (UID: \"641af4fe-dd54-4118-8985-d37a03d64f79\") " pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:30 crc kubenswrapper[4734]: E1205 23:20:30.626154 4734 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 23:20:30 crc kubenswrapper[4734]: E1205 23:20:30.626233 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/641af4fe-dd54-4118-8985-d37a03d64f79-metrics-certs podName:641af4fe-dd54-4118-8985-d37a03d64f79 nodeName:}" failed. No retries permitted until 2025-12-05 23:20:46.626205956 +0000 UTC m=+67.309610232 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/641af4fe-dd54-4118-8985-d37a03d64f79-metrics-certs") pod "network-metrics-daemon-l6r6g" (UID: "641af4fe-dd54-4118-8985-d37a03d64f79") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.693518 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.693594 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.693603 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.693625 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.693640 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:30Z","lastTransitionTime":"2025-12-05T23:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.797149 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.797229 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.797255 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.797284 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.797310 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:30Z","lastTransitionTime":"2025-12-05T23:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.900017 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.900107 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.900122 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.900146 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:30 crc kubenswrapper[4734]: I1205 23:20:30.900164 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:30Z","lastTransitionTime":"2025-12-05T23:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.003185 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.003239 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.003251 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.003276 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.003288 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:31Z","lastTransitionTime":"2025-12-05T23:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.014838 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8bfg7_2927a376-2f69-4820-a222-b86f08ece55a/ovnkube-controller/2.log" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.015675 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8bfg7_2927a376-2f69-4820-a222-b86f08ece55a/ovnkube-controller/1.log" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.019476 4734 generic.go:334] "Generic (PLEG): container finished" podID="2927a376-2f69-4820-a222-b86f08ece55a" containerID="ccde57f4cb8d41050120cab8e9d3de18cee5141f9f3ae7bd5abf452b06c74e8c" exitCode=1 Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.019571 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" event={"ID":"2927a376-2f69-4820-a222-b86f08ece55a","Type":"ContainerDied","Data":"ccde57f4cb8d41050120cab8e9d3de18cee5141f9f3ae7bd5abf452b06c74e8c"} Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.019657 4734 scope.go:117] "RemoveContainer" containerID="06d97ee97fa08051bb6f3bb012336e973a802ad22ab9e5370bc01cee6db062ba" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.021647 4734 scope.go:117] "RemoveContainer" containerID="ccde57f4cb8d41050120cab8e9d3de18cee5141f9f3ae7bd5abf452b06c74e8c" Dec 05 23:20:31 crc kubenswrapper[4734]: E1205 23:20:31.022135 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-8bfg7_openshift-ovn-kubernetes(2927a376-2f69-4820-a222-b86f08ece55a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" podUID="2927a376-2f69-4820-a222-b86f08ece55a" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.044946 4734 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:31Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.075761 4734 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2927a376-2f69-4820-a222-b86f08ece55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccde57f4cb8d41050120cab8e9d3de18cee5141f9f3ae7bd5abf452b06c74e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06d97ee97fa08051bb6f3bb012336e973a802ad22ab9e5370bc01cee6db062ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\":12.919288 6130 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 23:20:12.919320 6130 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 23:20:12.919356 6130 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1205 23:20:12.919370 6130 handler.go:190] Sending *v1.Pod event handler 6 for 
removal\\\\nI1205 23:20:12.919423 6130 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1205 23:20:12.919747 6130 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 23:20:12.919997 6130 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 23:20:12.920181 6130 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 23:20:12.920227 6130 handler.go:208] Removed *v1.Node event handler 7\\\\nI1205 23:20:12.920259 6130 handler.go:208] Removed *v1.Node event handler 2\\\\nI1205 23:20:12.920293 6130 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1205 23:20:12.920870 6130 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccde57f4cb8d41050120cab8e9d3de18cee5141f9f3ae7bd5abf452b06c74e8c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T23:20:30Z\\\",\\\"message\\\":\\\"id == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT 
Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 23:20:29.708321 6322 services_controller.go:445] Built service openshift-dns-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF1205 23:20:29.708749 6322 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2
cb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8bfg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:31Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.089836 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l6r6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"641af4fe-dd54-4118-8985-d37a03d64f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcvhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcvhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l6r6g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:31Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.106423 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.106479 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.106493 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.106513 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.106541 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:31Z","lastTransitionTime":"2025-12-05T23:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.109967 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:31Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.131781 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:31Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.143385 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b6c6b8505646feac77ac9d5fa758360c9f9a9f721ee74b52f449ec8ed30dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:31Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.158867 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c14bdf9de3cac15f0fff38f916e8da01527893739df49f94b97d7aebc76875a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T2
3:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:31Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.175880 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b8065541883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:31Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.189134 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2937461b56d6a54bf46d04d1246ef99a00bcc8072b52ccc25001376a3b640fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:31Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.205420 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f761cb9e068ee2d46de1b4604f8403e36d7d0d7b8133f0fcb0da1f312f1ef704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:31Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.210406 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.210477 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.210490 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.210516 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.210550 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:31Z","lastTransitionTime":"2025-12-05T23:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.222831 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ed109ca95328fcc458e818da95462a941b14b4a4ad494d73190e64ec494c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:31Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.237587 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-
05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:31Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.251450 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:31Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.272127 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:31Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.289117 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b089eaa-85b7-420d-914f-b053257be3c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32d72232eb5162100a1a381e51548864fc732ff00fd26239351ec294328fc7fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9676ebc0e731c50baebbab917a9dc814ceea006a370980021eaeb8bf822825b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be62799b986e89e6324a37ffed14cfc15d4fa6efec043e842534075da2b7547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff9c5cbf877fad8c2d4155cab3be27491de84cf4b7f3476f60a02de39936ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ff9c5cbf877fad8c2d4155cab3be27491de84cf4b7f3476f60a02de39936ab51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:31Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.304471 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:31Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.313826 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.313882 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.313897 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:31 crc 
kubenswrapper[4734]: I1205 23:20:31.313919 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.313933 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:31Z","lastTransitionTime":"2025-12-05T23:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.322664 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a85cf646-baec-45c1-a31e-97ce9e087c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44a4d2f938eb5aab362754086f82c0bb45b25e167e76d2dbe7192c92982ea9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93325400e317291da2931220b981cce963abd9cf3cb36d1959f19d136c0d2134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05
T23:20:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wdk8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:31Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.417569 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.417633 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.417647 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.417675 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.417689 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:31Z","lastTransitionTime":"2025-12-05T23:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.473593 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.520597 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.520663 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.520681 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.520713 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.520734 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:31Z","lastTransitionTime":"2025-12-05T23:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.623194 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.623258 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.623278 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.623301 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.623321 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:31Z","lastTransitionTime":"2025-12-05T23:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.726736 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.726792 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.726811 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.726841 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.726858 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:31Z","lastTransitionTime":"2025-12-05T23:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.829186 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.829292 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.829312 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.829346 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.829366 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:31Z","lastTransitionTime":"2025-12-05T23:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.932167 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.932214 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.932233 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.932256 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:31 crc kubenswrapper[4734]: I1205 23:20:31.932270 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:31Z","lastTransitionTime":"2025-12-05T23:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.026518 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8bfg7_2927a376-2f69-4820-a222-b86f08ece55a/ovnkube-controller/2.log" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.030660 4734 scope.go:117] "RemoveContainer" containerID="ccde57f4cb8d41050120cab8e9d3de18cee5141f9f3ae7bd5abf452b06c74e8c" Dec 05 23:20:32 crc kubenswrapper[4734]: E1205 23:20:32.031056 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-8bfg7_openshift-ovn-kubernetes(2927a376-2f69-4820-a222-b86f08ece55a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" podUID="2927a376-2f69-4820-a222-b86f08ece55a" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.035330 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.035357 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.035369 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.035387 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.035398 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:32Z","lastTransitionTime":"2025-12-05T23:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.051776 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f761cb9e068ee2d46de1b4604f8403e36d7d0d7b8133f0fcb0da1f312f1ef704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-pro
xy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:32Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.071147 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ed109ca95328fcc458e818da95462a941b14b4a4ad494d73190e64ec494c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e895b
ccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:05Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:32Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.090055 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:32Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.109583 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:32Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.130121 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c14bdf9de3cac15f0fff38f916e8da01527893739df49f94b97d7aebc76875a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T23:20:32Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.138807 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.138873 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.138894 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.138924 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.138945 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:32Z","lastTransitionTime":"2025-12-05T23:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.149912 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b8065541883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:32Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.164542 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2937461b56d6a54bf46d04d1246ef99a00bcc8072b52ccc25001376a3b640fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:32Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.186503 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:32Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.205322 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b089eaa-85b7-420d-914f-b053257be3c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32d72232eb5162100a1a381e51548864fc732ff00fd26239351ec294328fc7fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9676ebc0e731c50baebbab917a9dc814ceea006a370980021eaeb8bf822825b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be62799b986e89e6324a37ffed14cfc15d4fa6efec043e842534075da2b7547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff9c5cbf877fad8c2d4155cab3be27491de84cf4b7f3476f60a02de39936ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ff9c5cbf877fad8c2d4155cab3be27491de84cf4b7f3476f60a02de39936ab51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:32Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.226650 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:32Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.242453 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.242522 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.242585 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:32 crc 
kubenswrapper[4734]: I1205 23:20:32.242627 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.242669 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:32Z","lastTransitionTime":"2025-12-05T23:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.242710 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a85cf646-baec-45c1-a31e-97ce9e087c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44a4d2f938eb5aab362754086f82c0bb45b25e167e76d2dbe7192c92982ea9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93325400e317291da2931220b981cce963abd9cf3cb36d1959f19d136c0d2134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05
T23:20:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wdk8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:32Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.262109 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\
\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:32Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.284070 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:32Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.302362 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b6c6b8505646feac77ac9d5fa758360c9f9a9f721ee74b52f449ec8ed30dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:32Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.321238 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:32Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.346509 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.346574 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.346587 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.346608 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.346621 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:32Z","lastTransitionTime":"2025-12-05T23:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.354916 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2927a376-2f69-4820-a222-b86f08ece55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccde57f4cb8d41050120cab8e9d3de18cee5141f9f3ae7bd5abf452b06c74e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccde57f4cb8d41050120cab8e9d3de18cee5141f9f3ae7bd5abf452b06c74e8c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T23:20:30Z\\\",\\\"message\\\":\\\"id == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e 
Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 23:20:29.708321 6322 services_controller.go:445] Built service openshift-dns-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF1205 23:20:29.708749 6322 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8bfg7_openshift-ovn-kubernetes(2927a376-2f69-4820-a222-b86f08ece55a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd
578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8bfg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:32Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.373734 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l6r6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"641af4fe-dd54-4118-8985-d37a03d64f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcvhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcvhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l6r6g\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:32Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.449665 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.449716 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.449726 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.449748 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.449763 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:32Z","lastTransitionTime":"2025-12-05T23:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.555362 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.555412 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.555425 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.555445 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.555458 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:32Z","lastTransitionTime":"2025-12-05T23:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.613039 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.613047 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:32 crc kubenswrapper[4734]: E1205 23:20:32.613210 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.613073 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.613047 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:32 crc kubenswrapper[4734]: E1205 23:20:32.613312 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:20:32 crc kubenswrapper[4734]: E1205 23:20:32.613306 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:20:32 crc kubenswrapper[4734]: E1205 23:20:32.613398 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.658238 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.658287 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.658297 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.658317 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.658329 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:32Z","lastTransitionTime":"2025-12-05T23:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.760986 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.761052 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.761063 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.761080 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.761090 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:32Z","lastTransitionTime":"2025-12-05T23:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.864458 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.864654 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.864671 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.864702 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.864720 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:32Z","lastTransitionTime":"2025-12-05T23:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.967290 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.967352 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.967363 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.967384 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:32 crc kubenswrapper[4734]: I1205 23:20:32.967397 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:32Z","lastTransitionTime":"2025-12-05T23:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.070205 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.070252 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.070265 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.070285 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.070298 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:33Z","lastTransitionTime":"2025-12-05T23:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.173424 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.173468 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.173479 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.173498 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.173509 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:33Z","lastTransitionTime":"2025-12-05T23:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.276740 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.276818 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.276837 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.276865 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.276885 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:33Z","lastTransitionTime":"2025-12-05T23:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.379728 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.379807 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.379821 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.379847 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.379863 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:33Z","lastTransitionTime":"2025-12-05T23:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.458034 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.458091 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.458101 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.458124 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.458135 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:33Z","lastTransitionTime":"2025-12-05T23:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:33 crc kubenswrapper[4734]: E1205 23:20:33.483459 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:33Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.489382 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.489437 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.489449 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.489473 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.489485 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:33Z","lastTransitionTime":"2025-12-05T23:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:33 crc kubenswrapper[4734]: E1205 23:20:33.505108 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:33Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.510061 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.510120 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.510139 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.510160 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.510172 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:33Z","lastTransitionTime":"2025-12-05T23:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:33 crc kubenswrapper[4734]: E1205 23:20:33.525984 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:33Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.530421 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.530482 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.530495 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.530516 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.530556 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:33Z","lastTransitionTime":"2025-12-05T23:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:33 crc kubenswrapper[4734]: E1205 23:20:33.542990 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:33Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.547856 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.547890 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.547905 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.547922 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.547933 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:33Z","lastTransitionTime":"2025-12-05T23:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:33 crc kubenswrapper[4734]: E1205 23:20:33.563081 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:33Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:33 crc kubenswrapper[4734]: E1205 23:20:33.563257 4734 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.565487 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.565555 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.565567 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.565603 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.565615 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:33Z","lastTransitionTime":"2025-12-05T23:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.669137 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.669645 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.669725 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.669817 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.669939 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:33Z","lastTransitionTime":"2025-12-05T23:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.772853 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.772938 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.772964 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.772998 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.773028 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:33Z","lastTransitionTime":"2025-12-05T23:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.876116 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.876171 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.876180 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.876198 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.876211 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:33Z","lastTransitionTime":"2025-12-05T23:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.979749 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.979794 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.979823 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.979844 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:33 crc kubenswrapper[4734]: I1205 23:20:33.979857 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:33Z","lastTransitionTime":"2025-12-05T23:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.083416 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.083491 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.083501 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.083518 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.083555 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:34Z","lastTransitionTime":"2025-12-05T23:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.186301 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.186391 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.186404 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.186426 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.186440 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:34Z","lastTransitionTime":"2025-12-05T23:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.289163 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.289215 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.289226 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.289258 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.289271 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:34Z","lastTransitionTime":"2025-12-05T23:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.392166 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.392221 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.392235 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.392263 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.392283 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:34Z","lastTransitionTime":"2025-12-05T23:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.496013 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.496070 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.496081 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.496100 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.496113 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:34Z","lastTransitionTime":"2025-12-05T23:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.599858 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.600315 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.600401 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.600481 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.600575 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:34Z","lastTransitionTime":"2025-12-05T23:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.613355 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.613421 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.613632 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.613708 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:34 crc kubenswrapper[4734]: E1205 23:20:34.613870 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:20:34 crc kubenswrapper[4734]: E1205 23:20:34.614110 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:20:34 crc kubenswrapper[4734]: E1205 23:20:34.614218 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:20:34 crc kubenswrapper[4734]: E1205 23:20:34.614279 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.703835 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.703887 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.703899 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.703919 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.703933 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:34Z","lastTransitionTime":"2025-12-05T23:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.807874 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.808180 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.808241 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.808370 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.808437 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:34Z","lastTransitionTime":"2025-12-05T23:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.911583 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.912136 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.912296 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.912455 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:34 crc kubenswrapper[4734]: I1205 23:20:34.912646 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:34Z","lastTransitionTime":"2025-12-05T23:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.017344 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.017445 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.017462 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.017490 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.017514 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:35Z","lastTransitionTime":"2025-12-05T23:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.121387 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.121475 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.121496 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.121556 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.121580 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:35Z","lastTransitionTime":"2025-12-05T23:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.224510 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.224584 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.224594 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.224613 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.224623 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:35Z","lastTransitionTime":"2025-12-05T23:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.327833 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.327883 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.327897 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.327917 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.327932 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:35Z","lastTransitionTime":"2025-12-05T23:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.431669 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.431753 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.431777 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.431805 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.431823 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:35Z","lastTransitionTime":"2025-12-05T23:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.534719 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.534787 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.534802 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.534825 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.534842 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:35Z","lastTransitionTime":"2025-12-05T23:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.637401 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.637479 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.637503 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.637574 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.637601 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:35Z","lastTransitionTime":"2025-12-05T23:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.740670 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.740716 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.740730 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.740753 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.740770 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:35Z","lastTransitionTime":"2025-12-05T23:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.843376 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.843440 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.843450 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.843467 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.843478 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:35Z","lastTransitionTime":"2025-12-05T23:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.946185 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.946236 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.946250 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.946268 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:20:35 crc kubenswrapper[4734]: I1205 23:20:35.946280 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:35Z","lastTransitionTime":"2025-12-05T23:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.048421 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.048464 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.048474 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.048490 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.048502 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:36Z","lastTransitionTime":"2025-12-05T23:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.150683 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.150731 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.150743 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.150762 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.150774 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:36Z","lastTransitionTime":"2025-12-05T23:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.253449 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.253499 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.253511 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.253587 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.253625 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:36Z","lastTransitionTime":"2025-12-05T23:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.356675 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.356719 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.356729 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.356757 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.356769 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:36Z","lastTransitionTime":"2025-12-05T23:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.459665 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.459722 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.459738 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.459763 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.459780 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:36Z","lastTransitionTime":"2025-12-05T23:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.562355 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.562425 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.562446 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.562471 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.562490 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:36Z","lastTransitionTime":"2025-12-05T23:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.613317 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.613366 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.613398 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 23:20:36 crc kubenswrapper[4734]: E1205 23:20:36.613578 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 23:20:36 crc kubenswrapper[4734]: E1205 23:20:36.613786 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 23:20:36 crc kubenswrapper[4734]: E1205 23:20:36.613890 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.614409 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g"
Dec 05 23:20:36 crc kubenswrapper[4734]: E1205 23:20:36.614789 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.665899 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.665975 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.665998 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.666026 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.666047 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:36Z","lastTransitionTime":"2025-12-05T23:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.769793 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.769859 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.769880 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.769905 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.769924 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:36Z","lastTransitionTime":"2025-12-05T23:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.873387 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.873429 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.873438 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.873455 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.873490 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:36Z","lastTransitionTime":"2025-12-05T23:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.977164 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.977233 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.977250 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.977279 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:20:36 crc kubenswrapper[4734]: I1205 23:20:36.977297 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:36Z","lastTransitionTime":"2025-12-05T23:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.080555 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.080622 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.080636 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.080660 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.080675 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:37Z","lastTransitionTime":"2025-12-05T23:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.184352 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.184429 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.184445 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.184471 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.184485 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:37Z","lastTransitionTime":"2025-12-05T23:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.287330 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.287391 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.287402 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.287423 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.287437 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:37Z","lastTransitionTime":"2025-12-05T23:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.391287 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.391363 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.391374 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.391394 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.391408 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:37Z","lastTransitionTime":"2025-12-05T23:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.494063 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.494715 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.494774 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.494842 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.494871 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:37Z","lastTransitionTime":"2025-12-05T23:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.599007 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.599074 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.599088 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.599114 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.599131 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:37Z","lastTransitionTime":"2025-12-05T23:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.703007 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.703064 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.703077 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.703098 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.703113 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:37Z","lastTransitionTime":"2025-12-05T23:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.806757 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.806832 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.806856 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.806884 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.806905 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:37Z","lastTransitionTime":"2025-12-05T23:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.909704 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.909774 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.909786 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.909808 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:20:37 crc kubenswrapper[4734]: I1205 23:20:37.909826 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:37Z","lastTransitionTime":"2025-12-05T23:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.013884 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.013987 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.014051 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.014088 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.014165 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:38Z","lastTransitionTime":"2025-12-05T23:20:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.117864 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.117908 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.117919 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.117938 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.117951 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:38Z","lastTransitionTime":"2025-12-05T23:20:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.221012 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.221048 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.221057 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.221075 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.221086 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:38Z","lastTransitionTime":"2025-12-05T23:20:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.324793 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.324871 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.324893 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.324927 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.324948 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:38Z","lastTransitionTime":"2025-12-05T23:20:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.428078 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.428136 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.428147 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.428167 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.428182 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:38Z","lastTransitionTime":"2025-12-05T23:20:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.532151 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.532219 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.532237 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.532263 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.532281 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:38Z","lastTransitionTime":"2025-12-05T23:20:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.613496 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.613508 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:38 crc kubenswrapper[4734]: E1205 23:20:38.613903 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.613663 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:38 crc kubenswrapper[4734]: E1205 23:20:38.614047 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.613595 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:38 crc kubenswrapper[4734]: E1205 23:20:38.614343 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:20:38 crc kubenswrapper[4734]: E1205 23:20:38.614454 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.635496 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.635581 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.635601 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.635625 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.635645 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:38Z","lastTransitionTime":"2025-12-05T23:20:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.739586 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.739650 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.739671 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.739696 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.739714 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:38Z","lastTransitionTime":"2025-12-05T23:20:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.842847 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.842906 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.842917 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.842939 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.842951 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:38Z","lastTransitionTime":"2025-12-05T23:20:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.946006 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.946068 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.946085 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.946108 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:38 crc kubenswrapper[4734]: I1205 23:20:38.946125 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:38Z","lastTransitionTime":"2025-12-05T23:20:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.048998 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.049059 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.049078 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.049104 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.049123 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:39Z","lastTransitionTime":"2025-12-05T23:20:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.152416 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.152499 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.152564 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.152603 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.152629 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:39Z","lastTransitionTime":"2025-12-05T23:20:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.255732 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.255785 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.255795 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.255817 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.255830 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:39Z","lastTransitionTime":"2025-12-05T23:20:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.359269 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.359419 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.359438 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.359470 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.359490 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:39Z","lastTransitionTime":"2025-12-05T23:20:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.463157 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.463258 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.463280 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.463307 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.463328 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:39Z","lastTransitionTime":"2025-12-05T23:20:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.566627 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.566696 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.566714 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.566742 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.566763 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:39Z","lastTransitionTime":"2025-12-05T23:20:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.638005 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ed109ca95328fcc458e818da95462a941b14b4a4ad494d73190e64ec494c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:39Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.658780 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-
05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:39Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.669518 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.669807 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.669823 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.669847 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.669862 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:39Z","lastTransitionTime":"2025-12-05T23:20:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.681782 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:39Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.705162 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c14bdf9de3cac15f0fff38f916e8da01527893739df49f94b97d7aebc76875a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T23:20:39Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.722182 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b80655
41883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:39Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.734659 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2937461b56d6a54bf46d04d1246ef99a00bcc8072b52ccc25001376a3b640fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:39Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.747934 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f761cb9e068ee2d46de1b4604f8403e36d7d0d7b8133f0fcb0da1f312f1ef704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:39Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.762483 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\
\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:39Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.772820 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.772851 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.772859 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.772873 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 
23:20:39.772883 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:39Z","lastTransitionTime":"2025-12-05T23:20:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.776686 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b089eaa-85b7-420d-914f-b053257be3c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32d72232eb5162100a1a381e51548864fc732ff00fd26239351ec294328fc7fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9676ebc0e731c50baebbab917a9dc814ceea006a370980021eaeb8bf822825b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be62799b986e89e6324a37ffed14cfc15d4fa6efec043e842534075da2b7547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"host
IP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff9c5cbf877fad8c2d4155cab3be27491de84cf4b7f3476f60a02de39936ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff9c5cbf877fad8c2d4155cab3be27491de84cf4b7f3476f60a02de39936ab51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:39Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.792358 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:39Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.805767 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a85cf646-baec-45c1-a31e-97ce9e087c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44a4d2f938eb5aab362754086f82c0bb45b25e167e76d2dbe7192c92982ea9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93325400e317291da2931220b981cce963abd
9cf3cb36d1959f19d136c0d2134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wdk8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:39Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.821422 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:4
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:39Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.838317 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:39Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.850108 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b6c6b8505646feac77ac9d5fa758360c9f9a9f721ee74b52f449ec8ed30dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:39Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.864300 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:39Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.875831 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.875894 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.875946 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.875967 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.875981 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:39Z","lastTransitionTime":"2025-12-05T23:20:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.884209 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2927a376-2f69-4820-a222-b86f08ece55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccde57f4cb8d41050120cab8e9d3de18cee5141f9f3ae7bd5abf452b06c74e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccde57f4cb8d41050120cab8e9d3de18cee5141f9f3ae7bd5abf452b06c74e8c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T23:20:30Z\\\",\\\"message\\\":\\\"id == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e 
Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 23:20:29.708321 6322 services_controller.go:445] Built service openshift-dns-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF1205 23:20:29.708749 6322 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8bfg7_openshift-ovn-kubernetes(2927a376-2f69-4820-a222-b86f08ece55a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd
578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8bfg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:39Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.906554 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l6r6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"641af4fe-dd54-4118-8985-d37a03d64f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcvhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcvhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l6r6g\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:39Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.979251 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.979290 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.979302 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.979318 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:39 crc kubenswrapper[4734]: I1205 23:20:39.979329 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:39Z","lastTransitionTime":"2025-12-05T23:20:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.081772 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.081824 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.081841 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.081864 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.081882 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:40Z","lastTransitionTime":"2025-12-05T23:20:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.184436 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.184565 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.184606 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.184636 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.184655 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:40Z","lastTransitionTime":"2025-12-05T23:20:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.288124 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.288187 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.288202 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.288230 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.288257 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:40Z","lastTransitionTime":"2025-12-05T23:20:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.391391 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.391626 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.391636 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.391657 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.391669 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:40Z","lastTransitionTime":"2025-12-05T23:20:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.494011 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.494055 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.494065 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.494083 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.494094 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:40Z","lastTransitionTime":"2025-12-05T23:20:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.596735 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.596779 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.596789 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.596809 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.596822 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:40Z","lastTransitionTime":"2025-12-05T23:20:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.613688 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.613761 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.613796 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.613763 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:40 crc kubenswrapper[4734]: E1205 23:20:40.613892 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:20:40 crc kubenswrapper[4734]: E1205 23:20:40.614008 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:20:40 crc kubenswrapper[4734]: E1205 23:20:40.614175 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:20:40 crc kubenswrapper[4734]: E1205 23:20:40.614300 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.699554 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.699615 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.699626 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.699647 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.699659 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:40Z","lastTransitionTime":"2025-12-05T23:20:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.802821 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.802871 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.802882 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.802902 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.802915 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:40Z","lastTransitionTime":"2025-12-05T23:20:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.906164 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.906207 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.906218 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.906237 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:40 crc kubenswrapper[4734]: I1205 23:20:40.906252 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:40Z","lastTransitionTime":"2025-12-05T23:20:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.009587 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.009645 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.009656 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.009677 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.009687 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:41Z","lastTransitionTime":"2025-12-05T23:20:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.112498 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.112603 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.112622 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.112648 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.112665 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:41Z","lastTransitionTime":"2025-12-05T23:20:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.215887 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.215986 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.216026 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.216065 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.216086 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:41Z","lastTransitionTime":"2025-12-05T23:20:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.319375 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.319451 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.319465 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.319491 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.319509 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:41Z","lastTransitionTime":"2025-12-05T23:20:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.422913 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.422995 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.423015 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.423047 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.423076 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:41Z","lastTransitionTime":"2025-12-05T23:20:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.526460 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.526560 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.526574 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.526596 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.526609 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:41Z","lastTransitionTime":"2025-12-05T23:20:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.629160 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.629216 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.629233 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.629255 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.629275 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:41Z","lastTransitionTime":"2025-12-05T23:20:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.732196 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.732261 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.732285 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.732318 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.732342 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:41Z","lastTransitionTime":"2025-12-05T23:20:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.840266 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.840317 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.840335 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.840361 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.840382 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:41Z","lastTransitionTime":"2025-12-05T23:20:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.943348 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.943404 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.943430 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.943457 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:41 crc kubenswrapper[4734]: I1205 23:20:41.943475 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:41Z","lastTransitionTime":"2025-12-05T23:20:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.046930 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.046977 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.046989 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.047006 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.047018 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:42Z","lastTransitionTime":"2025-12-05T23:20:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.154679 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.154754 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.154776 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.154806 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.154840 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:42Z","lastTransitionTime":"2025-12-05T23:20:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.259002 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.259075 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.259100 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.259134 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.259152 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:42Z","lastTransitionTime":"2025-12-05T23:20:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.362627 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.362967 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.363051 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.363299 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.363394 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:42Z","lastTransitionTime":"2025-12-05T23:20:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.466393 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.466444 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.466455 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.466474 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.466485 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:42Z","lastTransitionTime":"2025-12-05T23:20:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.569822 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.569889 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.569902 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.569926 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.569941 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:42Z","lastTransitionTime":"2025-12-05T23:20:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.613066 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.613097 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.613134 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.613109 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:42 crc kubenswrapper[4734]: E1205 23:20:42.613257 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:20:42 crc kubenswrapper[4734]: E1205 23:20:42.613495 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:20:42 crc kubenswrapper[4734]: E1205 23:20:42.613640 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:20:42 crc kubenswrapper[4734]: E1205 23:20:42.613687 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.673566 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.673605 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.673615 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.673632 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.673645 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:42Z","lastTransitionTime":"2025-12-05T23:20:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.775780 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.775822 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.775834 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.775852 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.775864 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:42Z","lastTransitionTime":"2025-12-05T23:20:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.878329 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.878408 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.878422 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.878461 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.878477 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:42Z","lastTransitionTime":"2025-12-05T23:20:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.981295 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.981351 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.981363 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.981382 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:42 crc kubenswrapper[4734]: I1205 23:20:42.981394 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:42Z","lastTransitionTime":"2025-12-05T23:20:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.084397 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.084758 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.084821 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.084888 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.084970 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:43Z","lastTransitionTime":"2025-12-05T23:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.188282 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.188325 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.188335 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.188352 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.188363 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:43Z","lastTransitionTime":"2025-12-05T23:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.291325 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.291367 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.291376 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.291393 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.291405 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:43Z","lastTransitionTime":"2025-12-05T23:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.394886 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.394949 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.394963 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.394991 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.395010 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:43Z","lastTransitionTime":"2025-12-05T23:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.502434 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.502481 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.502493 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.502510 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.502543 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:43Z","lastTransitionTime":"2025-12-05T23:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.604811 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.604871 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.604885 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.604906 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.604922 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:43Z","lastTransitionTime":"2025-12-05T23:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.614504 4734 scope.go:117] "RemoveContainer" containerID="ccde57f4cb8d41050120cab8e9d3de18cee5141f9f3ae7bd5abf452b06c74e8c" Dec 05 23:20:43 crc kubenswrapper[4734]: E1205 23:20:43.614710 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-8bfg7_openshift-ovn-kubernetes(2927a376-2f69-4820-a222-b86f08ece55a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" podUID="2927a376-2f69-4820-a222-b86f08ece55a" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.708543 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.708599 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.708613 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.708633 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.708646 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:43Z","lastTransitionTime":"2025-12-05T23:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.812266 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.812317 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.812331 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.812351 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.812365 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:43Z","lastTransitionTime":"2025-12-05T23:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.816434 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.816472 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.816486 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.816499 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.816509 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:43Z","lastTransitionTime":"2025-12-05T23:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:43 crc kubenswrapper[4734]: E1205 23:20:43.831608 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:43Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.836046 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.836168 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.836233 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.836320 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.836400 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:43Z","lastTransitionTime":"2025-12-05T23:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:43 crc kubenswrapper[4734]: E1205 23:20:43.894660 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:43Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.899601 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.899776 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.899839 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.899918 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.900000 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:43Z","lastTransitionTime":"2025-12-05T23:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:43 crc kubenswrapper[4734]: E1205 23:20:43.916052 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:43Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:43 crc kubenswrapper[4734]: E1205 23:20:43.916242 4734 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.918218 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.918266 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.918276 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.918301 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:43 crc kubenswrapper[4734]: I1205 23:20:43.918312 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:43Z","lastTransitionTime":"2025-12-05T23:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.021309 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.021764 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.021838 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.021930 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.021998 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:44Z","lastTransitionTime":"2025-12-05T23:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.125284 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.125347 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.125358 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.125378 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.125392 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:44Z","lastTransitionTime":"2025-12-05T23:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.228168 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.228234 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.228247 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.228270 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.228286 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:44Z","lastTransitionTime":"2025-12-05T23:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.331231 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.331310 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.331324 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.331345 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.331358 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:44Z","lastTransitionTime":"2025-12-05T23:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.433601 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.433642 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.433651 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.433673 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.433684 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:44Z","lastTransitionTime":"2025-12-05T23:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.536608 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.536672 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.536686 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.536709 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.536725 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:44Z","lastTransitionTime":"2025-12-05T23:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.613675 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.613712 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.613831 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:44 crc kubenswrapper[4734]: E1205 23:20:44.613881 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.613911 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:44 crc kubenswrapper[4734]: E1205 23:20:44.613998 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:20:44 crc kubenswrapper[4734]: E1205 23:20:44.614221 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:20:44 crc kubenswrapper[4734]: E1205 23:20:44.614356 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.639217 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.639266 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.639279 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.639304 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.639318 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:44Z","lastTransitionTime":"2025-12-05T23:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.741986 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.742045 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.742059 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.742086 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.742100 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:44Z","lastTransitionTime":"2025-12-05T23:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.845488 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.845584 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.845606 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.845634 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.845659 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:44Z","lastTransitionTime":"2025-12-05T23:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.949512 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.949622 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.949638 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.949667 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:44 crc kubenswrapper[4734]: I1205 23:20:44.949686 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:44Z","lastTransitionTime":"2025-12-05T23:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.053085 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.053136 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.053155 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.053178 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.053190 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:45Z","lastTransitionTime":"2025-12-05T23:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.156367 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.156423 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.156438 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.156461 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.156475 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:45Z","lastTransitionTime":"2025-12-05T23:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.259249 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.259301 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.259310 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.259332 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.259349 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:45Z","lastTransitionTime":"2025-12-05T23:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.362223 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.362277 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.362289 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.362312 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.362326 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:45Z","lastTransitionTime":"2025-12-05T23:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.465129 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.465188 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.465202 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.465224 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.465238 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:45Z","lastTransitionTime":"2025-12-05T23:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.568129 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.568176 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.568187 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.568207 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.568218 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:45Z","lastTransitionTime":"2025-12-05T23:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.671668 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.671711 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.671729 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.671753 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.671769 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:45Z","lastTransitionTime":"2025-12-05T23:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.774707 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.775046 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.775062 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.775081 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.775092 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:45Z","lastTransitionTime":"2025-12-05T23:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.882588 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.882637 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.882650 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.882671 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.882685 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:45Z","lastTransitionTime":"2025-12-05T23:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.985544 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.985590 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.985602 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.985622 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:45 crc kubenswrapper[4734]: I1205 23:20:45.985636 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:45Z","lastTransitionTime":"2025-12-05T23:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.088842 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.088895 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.088913 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.088935 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.088952 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:46Z","lastTransitionTime":"2025-12-05T23:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.191876 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.191943 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.191959 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.191985 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.192003 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:46Z","lastTransitionTime":"2025-12-05T23:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.295661 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.295732 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.295750 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.295777 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.295795 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:46Z","lastTransitionTime":"2025-12-05T23:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.398937 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.399008 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.399024 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.399055 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.399073 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:46Z","lastTransitionTime":"2025-12-05T23:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.502800 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.502851 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.502863 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.502884 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.502896 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:46Z","lastTransitionTime":"2025-12-05T23:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.606198 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.606271 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.606290 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.606323 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.606341 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:46Z","lastTransitionTime":"2025-12-05T23:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.613776 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.613777 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.613832 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.613815 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:46 crc kubenswrapper[4734]: E1205 23:20:46.613957 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:20:46 crc kubenswrapper[4734]: E1205 23:20:46.614052 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:20:46 crc kubenswrapper[4734]: E1205 23:20:46.614279 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:20:46 crc kubenswrapper[4734]: E1205 23:20:46.614480 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.710555 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.710635 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.710654 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.710686 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.710707 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:46Z","lastTransitionTime":"2025-12-05T23:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.716189 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/641af4fe-dd54-4118-8985-d37a03d64f79-metrics-certs\") pod \"network-metrics-daemon-l6r6g\" (UID: \"641af4fe-dd54-4118-8985-d37a03d64f79\") " pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:46 crc kubenswrapper[4734]: E1205 23:20:46.716471 4734 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 23:20:46 crc kubenswrapper[4734]: E1205 23:20:46.716621 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/641af4fe-dd54-4118-8985-d37a03d64f79-metrics-certs podName:641af4fe-dd54-4118-8985-d37a03d64f79 nodeName:}" failed. No retries permitted until 2025-12-05 23:21:18.716582765 +0000 UTC m=+99.399987241 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/641af4fe-dd54-4118-8985-d37a03d64f79-metrics-certs") pod "network-metrics-daemon-l6r6g" (UID: "641af4fe-dd54-4118-8985-d37a03d64f79") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.813410 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.813464 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.813476 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.813499 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.813510 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:46Z","lastTransitionTime":"2025-12-05T23:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.916203 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.916249 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.916264 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.916286 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:46 crc kubenswrapper[4734]: I1205 23:20:46.916298 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:46Z","lastTransitionTime":"2025-12-05T23:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.019615 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.019698 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.019711 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.019740 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.019757 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:47Z","lastTransitionTime":"2025-12-05T23:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.122073 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.122113 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.122123 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.122156 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.122168 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:47Z","lastTransitionTime":"2025-12-05T23:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.224763 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.224809 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.224819 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.224839 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.224851 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:47Z","lastTransitionTime":"2025-12-05T23:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.327277 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.327320 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.327331 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.327350 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.327361 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:47Z","lastTransitionTime":"2025-12-05T23:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.430776 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.430853 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.430875 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.430912 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.430939 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:47Z","lastTransitionTime":"2025-12-05T23:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.533434 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.533492 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.533511 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.533577 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.533600 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:47Z","lastTransitionTime":"2025-12-05T23:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.635834 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.635885 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.635896 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.635914 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.635926 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:47Z","lastTransitionTime":"2025-12-05T23:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.738685 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.738758 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.738772 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.738792 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.738810 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:47Z","lastTransitionTime":"2025-12-05T23:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.841580 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.841649 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.841662 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.841690 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.841704 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:47Z","lastTransitionTime":"2025-12-05T23:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.944134 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.944183 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.944199 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.944222 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:47 crc kubenswrapper[4734]: I1205 23:20:47.944237 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:47Z","lastTransitionTime":"2025-12-05T23:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.048316 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.048376 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.048387 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.048409 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.048421 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:48Z","lastTransitionTime":"2025-12-05T23:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.091250 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d6kmh_1d76dc4e-40f3-4457-9a99-16f9a8ca8081/kube-multus/0.log" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.091317 4734 generic.go:334] "Generic (PLEG): container finished" podID="1d76dc4e-40f3-4457-9a99-16f9a8ca8081" containerID="ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8" exitCode=1 Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.091376 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d6kmh" event={"ID":"1d76dc4e-40f3-4457-9a99-16f9a8ca8081","Type":"ContainerDied","Data":"ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8"} Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.091992 4734 scope.go:117] "RemoveContainer" containerID="ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.107682 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:48Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.126601 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a85cf646-baec-45c1-a31e-97ce9e087c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44a4d2f938eb5aab362754086f82c0bb45b25e167e76d2dbe7192c92982ea9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93325400e317291da2931220b981cce963abd
9cf3cb36d1959f19d136c0d2134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wdk8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:48Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.147786 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T23:20:47Z\\\",\\\"message\\\":\\\"2025-12-05T23:20:02+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_90161e0f-9906-4827-895c-9cd783dd3007\\\\n2025-12-05T23:20:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_90161e0f-9906-4827-895c-9cd783dd3007 to /host/opt/cni/bin/\\\\n2025-12-05T23:20:02Z [verbose] multus-daemon started\\\\n2025-12-05T23:20:02Z [verbose] Readiness Indicator file check\\\\n2025-12-05T23:20:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:48Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.151595 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.151631 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.151641 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.151660 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.151671 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:48Z","lastTransitionTime":"2025-12-05T23:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.174247 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2927a376-2f69-4820-a222-b86f08ece55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccde57f4cb8d41050120cab8e9d3de18cee5141f9f3ae7bd5abf452b06c74e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccde57f4cb8d41050120cab8e9d3de18cee5141f9f3ae7bd5abf452b06c74e8c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T23:20:30Z\\\",\\\"message\\\":\\\"id == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e 
Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 23:20:29.708321 6322 services_controller.go:445] Built service openshift-dns-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF1205 23:20:29.708749 6322 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8bfg7_openshift-ovn-kubernetes(2927a376-2f69-4820-a222-b86f08ece55a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd
578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8bfg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:48Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.189357 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l6r6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"641af4fe-dd54-4118-8985-d37a03d64f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcvhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcvhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l6r6g\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:48Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.204746 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:48Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.221728 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:48Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.233928 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b6c6b8505646feac77ac9d5fa758360c9f9a9f721ee74b52f449ec8ed30dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:48Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.250131 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c14bdf9de3cac15f0fff38f916e8da01527893739df49f94b97d7aebc76875a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T2
3:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:48Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.254245 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.254291 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.254306 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.254324 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.254342 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:48Z","lastTransitionTime":"2025-12-05T23:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.263923 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b8065541883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:48Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.279764 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2937461b56d6a54bf46d04d1246ef99a00bcc8072b52ccc25001376a3b640fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:48Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.292557 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f761cb9e068ee2d46de1b4604f8403e36d7d0d7b8133f0fcb0da1f312f1ef704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:48Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.307882 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ed109ca95328fcc458e818da95462a941b14b4a4ad494d73190e64ec494c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:48Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.322334 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:48Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.337130 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:48Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.352575 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:48Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.357034 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.357082 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.357095 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.357114 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.357128 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:48Z","lastTransitionTime":"2025-12-05T23:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.369721 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b089eaa-85b7-420d-914f-b053257be3c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32d72232eb5162100a1a381e51548864fc732ff00fd26239351ec294328fc7fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://b9676ebc0e731c50baebbab917a9dc814ceea006a370980021eaeb8bf822825b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be62799b986e89e6324a37ffed14cfc15d4fa6efec043e842534075da2b7547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff9c5cbf877fad8c2d4155cab3be27491de84cf4b7f3476f60a02de39936ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff9c5cbf877fad8c2d4155cab3be27491de84cf4b7f3476f60a02de39936ab51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:48Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.460285 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.460354 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.460365 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.460384 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.460394 4734 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:48Z","lastTransitionTime":"2025-12-05T23:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.563169 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.563216 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.563228 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.563251 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.563261 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:48Z","lastTransitionTime":"2025-12-05T23:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.613192 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:48 crc kubenswrapper[4734]: E1205 23:20:48.613352 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.613603 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:48 crc kubenswrapper[4734]: E1205 23:20:48.613660 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.613764 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:48 crc kubenswrapper[4734]: E1205 23:20:48.613827 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.615767 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:48 crc kubenswrapper[4734]: E1205 23:20:48.615855 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.666637 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.666694 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.666706 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.666726 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.666740 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:48Z","lastTransitionTime":"2025-12-05T23:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.770023 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.770072 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.770083 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.770103 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.770116 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:48Z","lastTransitionTime":"2025-12-05T23:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.873572 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.873621 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.873633 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.873653 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.873665 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:48Z","lastTransitionTime":"2025-12-05T23:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.976807 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.976853 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.976866 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.976883 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:48 crc kubenswrapper[4734]: I1205 23:20:48.976895 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:48Z","lastTransitionTime":"2025-12-05T23:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.079338 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.079750 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.079832 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.079925 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.080005 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:49Z","lastTransitionTime":"2025-12-05T23:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.097066 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d6kmh_1d76dc4e-40f3-4457-9a99-16f9a8ca8081/kube-multus/0.log" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.097557 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d6kmh" event={"ID":"1d76dc4e-40f3-4457-9a99-16f9a8ca8081","Type":"ContainerStarted","Data":"8453d43131f407bdf61410dd38713b44aea86c8647825551f40b2c41552206e8"} Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.117073 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b089eaa-85b7-420d-914f-b053257be3c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32d72232eb5162100a1a381e51548864fc732ff00fd26239351ec294328fc7fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243
b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9676ebc0e731c50baebbab917a9dc814ceea006a370980021eaeb8bf822825b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be62799b986e89e6324a37ffed14cfc15d4fa6efec043e842534075da2b7547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff9c5cbf877fad8c2d4155cab3be27491de84cf4b7f3476f60a02de39936ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff9c5cbf877fad8c2d4155cab3be27491de84cf4b7f3476f60a02de39936ab51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.131303 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.149042 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.164139 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a85cf646-baec-45c1-a31e-97ce9e087c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44a4d2f938eb5aab362754086f82c0bb45b25e167e76d2dbe7192c92982ea9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93325400e317291da2931220b981cce963abd
9cf3cb36d1959f19d136c0d2134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wdk8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.176499 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.183768 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.183836 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.183852 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.183876 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.183897 4734 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:49Z","lastTransitionTime":"2025-12-05T23:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.187600 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b6c6b8505646feac77ac9d5fa758360c9f9a9f721ee74b52f449ec8ed30dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.201222 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8453d43131f407bdf61410dd38713b44aea86c8647825551f40b2c41552206e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T23:20:47Z\\\",\\\"message\\\":\\\"2025-12-05T23:20:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_90161e0f-9906-4827-895c-9cd783dd3007\\\\n2025-12-05T23:20:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_90161e0f-9906-4827-895c-9cd783dd3007 to /host/opt/cni/bin/\\\\n2025-12-05T23:20:02Z [verbose] multus-daemon started\\\\n2025-12-05T23:20:02Z [verbose] 
Readiness Indicator file check\\\\n2025-12-05T23:20:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.225820 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2927a376-2f69-4820-a222-b86f08ece55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccde57f4cb8d41050120cab8e9d3de18cee5141f9f3ae7bd5abf452b06c74e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccde57f4cb8d41050120cab8e9d3de18cee5141f9f3ae7bd5abf452b06c74e8c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T23:20:30Z\\\",\\\"message\\\":\\\"id == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 23:20:29.708321 6322 services_controller.go:445] Built service openshift-dns-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF1205 23:20:29.708749 6322 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8bfg7_openshift-ovn-kubernetes(2927a376-2f69-4820-a222-b86f08ece55a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd
578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8bfg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.238486 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l6r6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"641af4fe-dd54-4118-8985-d37a03d64f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcvhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcvhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l6r6g\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.252191 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.264902 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.277920 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.287037 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.287068 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.287079 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.287099 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.287110 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:49Z","lastTransitionTime":"2025-12-05T23:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.289503 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c14bdf9de3cac15f0fff38f916e8da01527893739df49f94b97d7aebc76875a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.301443 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},
{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b8065541883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.312911 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2937461b56d6a54bf46d04d1246ef99a00bcc8072b52ccc25001376a3b640fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.323894 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f761cb9e068ee2d46de1b4604f8403e36d7d0d7b8133f0fcb0da1f312f1ef704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.339608 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ed109ca95328fcc458e818da95462a941b14b4a4ad494d73190e64ec494c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.389882 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.389919 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.389929 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.389949 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.389964 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:49Z","lastTransitionTime":"2025-12-05T23:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.493384 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.493438 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.493453 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.493477 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.493494 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:49Z","lastTransitionTime":"2025-12-05T23:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.596715 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.596765 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.596783 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.596804 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.596818 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:49Z","lastTransitionTime":"2025-12-05T23:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.629156 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8453d43131f407bdf61410dd38713b44aea86c8647825551f40b2c41552206e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T23:20:47Z\\\",\\\"message\\\":\\\"2025-12-05T23:20:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_90161e0f-9906-4827-895c-9cd783dd3007\\\\n2025-12-05T23:20:02+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_90161e0f-9906-4827-895c-9cd783dd3007 to /host/opt/cni/bin/\\\\n2025-12-05T23:20:02Z [verbose] multus-daemon started\\\\n2025-12-05T23:20:02Z [verbose] Readiness Indicator file check\\\\n2025-12-05T23:20:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.648770 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2927a376-2f69-4820-a222-b86f08ece55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccde57f4cb8d41050120cab8e9d3de18cee5141f9f3ae7bd5abf452b06c74e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccde57f4cb8d41050120cab8e9d3de18cee5141f9f3ae7bd5abf452b06c74e8c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T23:20:30Z\\\",\\\"message\\\":\\\"id == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 23:20:29.708321 6322 services_controller.go:445] Built service openshift-dns-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF1205 23:20:29.708749 6322 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8bfg7_openshift-ovn-kubernetes(2927a376-2f69-4820-a222-b86f08ece55a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd
578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8bfg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.659452 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l6r6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"641af4fe-dd54-4118-8985-d37a03d64f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcvhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcvhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l6r6g\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.674806 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.688242 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.698641 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.698680 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.698691 4734 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.698711 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.698726 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:49Z","lastTransitionTime":"2025-12-05T23:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.703398 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b6c6b8505646feac77ac9d5fa758360c9
f9a9f721ee74b52f449ec8ed30dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.722623 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c14bdf9de3cac15f0fff38f916e8da01527893739df49f94b97d7aebc76875a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.736708 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b80655
41883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.748169 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2937461b56d6a54bf46d04d1246ef99a00bcc8072b52ccc25001376a3b640fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.761397 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f761cb9e068ee2d46de1b4604f8403e36d7d0d7b8133f0fcb0da1f312f1ef704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.779376 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ed109ca95328fcc458e818da95462a941b14b4a4ad494d73190e64ec494c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.792829 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.801479 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.801512 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.801536 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.801559 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.801570 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:49Z","lastTransitionTime":"2025-12-05T23:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.805093 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.817976 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.828869 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b089eaa-85b7-420d-914f-b053257be3c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32d72232eb5162100a1a381e51548864fc732ff00fd26239351ec294328fc7fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9676ebc0e731c50baebbab917a9dc814ceea006a370980021eaeb8bf822825b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be62799b986e89e6324a37ffed14cfc15d4fa6efec043e842534075da2b7547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff9c5cbf877fad8c2d4155cab3be27491de84cf4b7f3476f60a02de39936ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ff9c5cbf877fad8c2d4155cab3be27491de84cf4b7f3476f60a02de39936ab51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.840018 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.852434 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a85cf646-baec-45c1-a31e-97ce9e087c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44a4d2f938eb5aab362754086f82c0bb45b25e167e76d2dbe7192c92982ea9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93325400e317291da2931220b981cce963abd
9cf3cb36d1959f19d136c0d2134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wdk8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:49Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.905131 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.905177 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.905188 4734 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.905209 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:49 crc kubenswrapper[4734]: I1205 23:20:49.905223 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:49Z","lastTransitionTime":"2025-12-05T23:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.008185 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.008229 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.008239 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.008258 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.008275 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:50Z","lastTransitionTime":"2025-12-05T23:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.111764 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.111827 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.111841 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.111863 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.111875 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:50Z","lastTransitionTime":"2025-12-05T23:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.215433 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.215470 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.215481 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.215498 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.215508 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:50Z","lastTransitionTime":"2025-12-05T23:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.318347 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.318387 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.318398 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.318415 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.318427 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:50Z","lastTransitionTime":"2025-12-05T23:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.421213 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.421277 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.421295 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.421319 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.421335 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:50Z","lastTransitionTime":"2025-12-05T23:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.525073 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.525295 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.525311 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.525332 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.525346 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:50Z","lastTransitionTime":"2025-12-05T23:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.613296 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.613440 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:50 crc kubenswrapper[4734]: E1205 23:20:50.613592 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:20:50 crc kubenswrapper[4734]: E1205 23:20:50.613919 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.614000 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.614092 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:50 crc kubenswrapper[4734]: E1205 23:20:50.614211 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:20:50 crc kubenswrapper[4734]: E1205 23:20:50.614287 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.628338 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.628391 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.628407 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.628427 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.628441 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:50Z","lastTransitionTime":"2025-12-05T23:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.731598 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.731660 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.731674 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.731698 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.731717 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:50Z","lastTransitionTime":"2025-12-05T23:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.835052 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.835099 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.835118 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.835151 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.835166 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:50Z","lastTransitionTime":"2025-12-05T23:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.938617 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.938673 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.938686 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.938707 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:50 crc kubenswrapper[4734]: I1205 23:20:50.938719 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:50Z","lastTransitionTime":"2025-12-05T23:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.041769 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.041833 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.041849 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.041871 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.041885 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:51Z","lastTransitionTime":"2025-12-05T23:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.144921 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.145015 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.145037 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.145066 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.145087 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:51Z","lastTransitionTime":"2025-12-05T23:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.247581 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.247644 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.247669 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.247694 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.247708 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:51Z","lastTransitionTime":"2025-12-05T23:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.350141 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.350191 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.350231 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.350251 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.350260 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:51Z","lastTransitionTime":"2025-12-05T23:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.453326 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.453381 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.453397 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.453426 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.453440 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:51Z","lastTransitionTime":"2025-12-05T23:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.556514 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.556620 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.556638 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.556664 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.556681 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:51Z","lastTransitionTime":"2025-12-05T23:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.660625 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.660683 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.660704 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.660730 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.660751 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:51Z","lastTransitionTime":"2025-12-05T23:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.764079 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.764159 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.764178 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.764205 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.764223 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:51Z","lastTransitionTime":"2025-12-05T23:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.866600 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.866662 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.866682 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.866707 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.866727 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:51Z","lastTransitionTime":"2025-12-05T23:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.970331 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.970439 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.970471 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.970501 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:51 crc kubenswrapper[4734]: I1205 23:20:51.970548 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:51Z","lastTransitionTime":"2025-12-05T23:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.074767 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.075306 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.075320 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.075341 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.075353 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:52Z","lastTransitionTime":"2025-12-05T23:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.177771 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.177833 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.177843 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.177864 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.177880 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:52Z","lastTransitionTime":"2025-12-05T23:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.280031 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.280088 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.280105 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.280128 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.280145 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:52Z","lastTransitionTime":"2025-12-05T23:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.382912 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.382963 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.382972 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.382993 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.383005 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:52Z","lastTransitionTime":"2025-12-05T23:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.486204 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.486277 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.486299 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.486330 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.486354 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:52Z","lastTransitionTime":"2025-12-05T23:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.589627 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.589710 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.589736 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.589770 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.589792 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:52Z","lastTransitionTime":"2025-12-05T23:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.613302 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.613354 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.613392 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.613428 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:52 crc kubenswrapper[4734]: E1205 23:20:52.613510 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:20:52 crc kubenswrapper[4734]: E1205 23:20:52.613615 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:20:52 crc kubenswrapper[4734]: E1205 23:20:52.613738 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:20:52 crc kubenswrapper[4734]: E1205 23:20:52.613811 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.693357 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.693431 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.693448 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.693475 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.693493 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:52Z","lastTransitionTime":"2025-12-05T23:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.797396 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.797461 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.797471 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.797492 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.797510 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:52Z","lastTransitionTime":"2025-12-05T23:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.900549 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.900606 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.900619 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.900639 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:52 crc kubenswrapper[4734]: I1205 23:20:52.900652 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:52Z","lastTransitionTime":"2025-12-05T23:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.003568 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.003618 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.003631 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.003650 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.003663 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:53Z","lastTransitionTime":"2025-12-05T23:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.107510 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.107645 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.107674 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.107710 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.107745 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:53Z","lastTransitionTime":"2025-12-05T23:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.211359 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.211447 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.211471 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.211507 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.211572 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:53Z","lastTransitionTime":"2025-12-05T23:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.315823 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.315877 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.315890 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.315914 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.315927 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:53Z","lastTransitionTime":"2025-12-05T23:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.418909 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.418964 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.418976 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.418993 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.419003 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:53Z","lastTransitionTime":"2025-12-05T23:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.522054 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.522099 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.522110 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.522128 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.522141 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:53Z","lastTransitionTime":"2025-12-05T23:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.624873 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.624945 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.624963 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.624987 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.625007 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:53Z","lastTransitionTime":"2025-12-05T23:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.727836 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.727923 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.727944 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.727974 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.727994 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:53Z","lastTransitionTime":"2025-12-05T23:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.831609 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.831667 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.831678 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.831702 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.831718 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:53Z","lastTransitionTime":"2025-12-05T23:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.935082 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.935159 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.935177 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.935210 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.935229 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:53Z","lastTransitionTime":"2025-12-05T23:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.997130 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.997203 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.997224 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.997251 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:53 crc kubenswrapper[4734]: I1205 23:20:53.997272 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:53Z","lastTransitionTime":"2025-12-05T23:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:54 crc kubenswrapper[4734]: E1205 23:20:54.011863 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:54Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.017721 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.017770 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.017783 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.017805 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.017820 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:54Z","lastTransitionTime":"2025-12-05T23:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:54 crc kubenswrapper[4734]: E1205 23:20:54.030540 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:54Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.035234 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.035302 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.035315 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.035339 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.035353 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:54Z","lastTransitionTime":"2025-12-05T23:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:54 crc kubenswrapper[4734]: E1205 23:20:54.050368 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:54Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.055682 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.055759 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.055771 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.055794 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.055808 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:54Z","lastTransitionTime":"2025-12-05T23:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:54 crc kubenswrapper[4734]: E1205 23:20:54.073606 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:54Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.078073 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.078114 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.078125 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.078172 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.078186 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:54Z","lastTransitionTime":"2025-12-05T23:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:54 crc kubenswrapper[4734]: E1205 23:20:54.091337 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:54Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:54 crc kubenswrapper[4734]: E1205 23:20:54.091465 4734 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.093564 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.093622 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.093642 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.093672 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.093692 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:54Z","lastTransitionTime":"2025-12-05T23:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.196841 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.196894 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.196911 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.196934 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.196953 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:54Z","lastTransitionTime":"2025-12-05T23:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.301899 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.301941 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.301951 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.301971 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.301985 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:54Z","lastTransitionTime":"2025-12-05T23:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.405006 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.405076 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.405085 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.405124 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.405138 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:54Z","lastTransitionTime":"2025-12-05T23:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.507996 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.508045 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.508057 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.508077 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.508088 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:54Z","lastTransitionTime":"2025-12-05T23:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.610780 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.610857 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.610906 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.610936 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.610956 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:54Z","lastTransitionTime":"2025-12-05T23:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.613040 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.613088 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.613133 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.613044 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:54 crc kubenswrapper[4734]: E1205 23:20:54.613218 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:20:54 crc kubenswrapper[4734]: E1205 23:20:54.613362 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:20:54 crc kubenswrapper[4734]: E1205 23:20:54.613618 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:20:54 crc kubenswrapper[4734]: E1205 23:20:54.620222 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.714422 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.714503 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.714603 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.714631 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.714651 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:54Z","lastTransitionTime":"2025-12-05T23:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.817933 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.818000 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.818021 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.818052 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.818075 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:54Z","lastTransitionTime":"2025-12-05T23:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.921234 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.921310 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.921334 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.921362 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:54 crc kubenswrapper[4734]: I1205 23:20:54.921382 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:54Z","lastTransitionTime":"2025-12-05T23:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.024206 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.024286 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.024313 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.024344 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.024365 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:55Z","lastTransitionTime":"2025-12-05T23:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.127303 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.127369 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.127386 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.127410 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.127429 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:55Z","lastTransitionTime":"2025-12-05T23:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.231267 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.231318 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.231330 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.231348 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.231361 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:55Z","lastTransitionTime":"2025-12-05T23:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.334794 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.334860 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.334880 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.334907 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.334927 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:55Z","lastTransitionTime":"2025-12-05T23:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.438049 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.438108 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.438125 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.438150 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.438205 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:55Z","lastTransitionTime":"2025-12-05T23:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.540977 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.541036 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.541054 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.541081 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.541100 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:55Z","lastTransitionTime":"2025-12-05T23:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.644855 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.645312 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.645503 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.645783 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.646011 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:55Z","lastTransitionTime":"2025-12-05T23:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.749868 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.749927 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.749940 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.749960 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.749972 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:55Z","lastTransitionTime":"2025-12-05T23:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.853476 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.853560 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.853578 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.853599 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.853616 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:55Z","lastTransitionTime":"2025-12-05T23:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.956125 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.956191 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.956219 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.956251 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:55 crc kubenswrapper[4734]: I1205 23:20:55.956273 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:55Z","lastTransitionTime":"2025-12-05T23:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.058503 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.058608 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.058638 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.058667 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.058688 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:56Z","lastTransitionTime":"2025-12-05T23:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.161415 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.161488 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.161511 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.161578 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.161599 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:56Z","lastTransitionTime":"2025-12-05T23:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.264207 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.264252 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.264261 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.264280 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.264290 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:56Z","lastTransitionTime":"2025-12-05T23:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.367709 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.367771 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.367784 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.367805 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.367817 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:56Z","lastTransitionTime":"2025-12-05T23:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.472266 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.472335 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.472350 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.472381 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.472397 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:56Z","lastTransitionTime":"2025-12-05T23:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.575969 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.576017 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.576026 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.576044 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.576053 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:56Z","lastTransitionTime":"2025-12-05T23:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.613808 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.613842 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.613886 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.613829 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:56 crc kubenswrapper[4734]: E1205 23:20:56.614004 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:20:56 crc kubenswrapper[4734]: E1205 23:20:56.614191 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:20:56 crc kubenswrapper[4734]: E1205 23:20:56.614284 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:20:56 crc kubenswrapper[4734]: E1205 23:20:56.614347 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.679243 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.679317 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.679336 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.679363 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.679381 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:56Z","lastTransitionTime":"2025-12-05T23:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.782692 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.782742 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.782760 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.782782 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.782800 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:56Z","lastTransitionTime":"2025-12-05T23:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.885335 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.885417 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.885437 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.885461 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.885514 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:56Z","lastTransitionTime":"2025-12-05T23:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.988988 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.989113 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.989139 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.989169 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:56 crc kubenswrapper[4734]: I1205 23:20:56.989192 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:56Z","lastTransitionTime":"2025-12-05T23:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.092574 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.092650 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.092668 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.092693 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.092713 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:57Z","lastTransitionTime":"2025-12-05T23:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.196184 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.196268 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.196292 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.196323 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.196347 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:57Z","lastTransitionTime":"2025-12-05T23:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.299354 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.299422 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.299450 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.299472 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.299490 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:57Z","lastTransitionTime":"2025-12-05T23:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.402967 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.403005 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.403017 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.403033 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.403044 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:57Z","lastTransitionTime":"2025-12-05T23:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.506745 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.506827 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.506855 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.506890 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.506918 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:57Z","lastTransitionTime":"2025-12-05T23:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.611051 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.611120 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.611137 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.611162 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.611180 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:57Z","lastTransitionTime":"2025-12-05T23:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.715197 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.715273 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.715296 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.715327 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.715351 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:57Z","lastTransitionTime":"2025-12-05T23:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.818852 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.818930 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.818944 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.818971 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.818987 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:57Z","lastTransitionTime":"2025-12-05T23:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.922930 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.923006 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.923019 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.923037 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:57 crc kubenswrapper[4734]: I1205 23:20:57.923048 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:57Z","lastTransitionTime":"2025-12-05T23:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.026614 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.026682 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.026701 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.026728 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.026748 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:58Z","lastTransitionTime":"2025-12-05T23:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.129716 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.129787 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.129804 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.129830 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.129848 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:58Z","lastTransitionTime":"2025-12-05T23:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.233707 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.233784 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.233810 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.233839 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.233860 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:58Z","lastTransitionTime":"2025-12-05T23:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.337842 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.337928 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.337954 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.337989 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.338012 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:58Z","lastTransitionTime":"2025-12-05T23:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.441112 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.441177 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.441194 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.441220 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.441240 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:58Z","lastTransitionTime":"2025-12-05T23:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.544648 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.544779 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.544798 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.544822 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.544839 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:58Z","lastTransitionTime":"2025-12-05T23:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.613144 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.613178 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.613255 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:20:58 crc kubenswrapper[4734]: E1205 23:20:58.613492 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:20:58 crc kubenswrapper[4734]: E1205 23:20:58.614098 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:20:58 crc kubenswrapper[4734]: E1205 23:20:58.614396 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.614786 4734 scope.go:117] "RemoveContainer" containerID="ccde57f4cb8d41050120cab8e9d3de18cee5141f9f3ae7bd5abf452b06c74e8c" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.615384 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:20:58 crc kubenswrapper[4734]: E1205 23:20:58.615778 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.647374 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.647427 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.647450 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.647479 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.647500 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:58Z","lastTransitionTime":"2025-12-05T23:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.750221 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.750265 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.750282 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.750307 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.750348 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:58Z","lastTransitionTime":"2025-12-05T23:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.853405 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.853464 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.853479 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.853503 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.853517 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:58Z","lastTransitionTime":"2025-12-05T23:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.956420 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.956473 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.956544 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.956567 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:58 crc kubenswrapper[4734]: I1205 23:20:58.956586 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:58Z","lastTransitionTime":"2025-12-05T23:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.063543 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.063596 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.063609 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.063639 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.063658 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:59Z","lastTransitionTime":"2025-12-05T23:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.136124 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8bfg7_2927a376-2f69-4820-a222-b86f08ece55a/ovnkube-controller/2.log" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.139102 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" event={"ID":"2927a376-2f69-4820-a222-b86f08ece55a","Type":"ContainerStarted","Data":"9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549"} Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.139646 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.159184 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2937461b56d6a54bf46d04d1246ef99a00bc
c8072b52ccc25001376a3b640fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.166197 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.166248 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.166260 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 
23:20:59.166280 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.166294 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:59Z","lastTransitionTime":"2025-12-05T23:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.174367 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f761cb9e068ee2d46de1b4604f8403e36d7d0d7b8133f0fcb0da1f312f1ef704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.201374 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ed109ca95328fcc458e818da95462a941b14b4a4ad494d73190e64ec494c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc
4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\
\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.216754 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.231742 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.245089 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c14bdf9de3cac15f0fff38f916e8da01527893739df49f94b97d7aebc76875a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.258816 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b80655
41883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.269072 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.269118 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.269130 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.269147 4734 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.269163 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:59Z","lastTransitionTime":"2025-12-05T23:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.274683 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9d
a410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.291112 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b089eaa-85b7-420d-914f-b053257be3c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32d72232eb5162100a1a381e51548864fc732ff00fd26239351ec294328fc7fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9676ebc0e731c50baebbab917a9dc814ceea006a370980021eaeb8bf822825b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be62799b986e89e6324a37ffed14cfc15d4fa6efec043e842534075da2b7547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff9c5cbf877fad8c2d4155cab3be27491de84cf4b7f3476f60a02de39936ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ff9c5cbf877fad8c2d4155cab3be27491de84cf4b7f3476f60a02de39936ab51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.307841 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.323455 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a85cf646-baec-45c1-a31e-97ce9e087c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44a4d2f938eb5aab362754086f82c0bb45b25e167e76d2dbe7192c92982ea9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93325400e317291da2931220b981cce963abd
9cf3cb36d1959f19d136c0d2134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wdk8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.337165 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l6r6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"641af4fe-dd54-4118-8985-d37a03d64f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcvhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcvhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l6r6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc 
kubenswrapper[4734]: I1205 23:20:59.351970 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62f
afdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/k
ube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.371452 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.372044 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.372080 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.372092 4734 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.372110 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.372124 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:59Z","lastTransitionTime":"2025-12-05T23:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.399888 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b6c6b8505646feac77ac9d5fa758360c9
f9a9f721ee74b52f449ec8ed30dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.415088 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8453d43131f407bdf61410dd38713b44aea86c8647825551f40b2c41552206e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T23:20:47Z\\\",\\\"message\\\":\\\"2025-12-05T23:20:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_90161e0f-9906-4827-895c-9cd783dd3007\\\\n2025-12-05T23:20:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_90161e0f-9906-4827-895c-9cd783dd3007 to /host/opt/cni/bin/\\\\n2025-12-05T23:20:02Z [verbose] multus-daemon started\\\\n2025-12-05T23:20:02Z [verbose] 
Readiness Indicator file check\\\\n2025-12-05T23:20:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.434564 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2927a376-2f69-4820-a222-b86f08ece55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccde57f4cb8d41050120cab8e9d3de18cee5141f9f3ae7bd5abf452b06c74e8c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T23:20:30Z\\\",\\\"message\\\":\\\"id == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 23:20:29.708321 6322 services_controller.go:445] Built service openshift-dns-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF1205 23:20:29.708749 6322 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8bfg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.474248 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.474295 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.474309 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.474327 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.474342 4734 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:59Z","lastTransitionTime":"2025-12-05T23:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.577255 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.577308 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.577321 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.577338 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.577352 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:59Z","lastTransitionTime":"2025-12-05T23:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.631449 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a85cf646-baec-45c1-a31e-97ce9e087c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44a4d2f938eb5aab362754086f82c0bb45b25e167e76d2dbe7192c92982ea9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93325400e317291da2931220b981cce963abd9cf3cb36d1959f19d136c0d2134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wdk8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.647065 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.668481 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b6c6b8505646feac77ac9d5fa758360c9f9a9f721ee74b52f449ec8ed30dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.681405 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.681494 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.681563 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.681623 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.681645 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:59Z","lastTransitionTime":"2025-12-05T23:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.687638 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8453d43131f407bdf61410dd38713b44aea86c8647825551f40b2c41552206e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T23:20:47Z\\\",\\\"message\\\":\\\"2025-12-05T23:20:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_90161e0f-9906-4827-895c-9cd783dd3007\\\\n2025-12-05T23:20:02+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_90161e0f-9906-4827-895c-9cd783dd3007 to /host/opt/cni/bin/\\\\n2025-12-05T23:20:02Z [verbose] multus-daemon started\\\\n2025-12-05T23:20:02Z [verbose] Readiness Indicator file check\\\\n2025-12-05T23:20:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.718627 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2927a376-2f69-4820-a222-b86f08ece55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccde57f4cb8d41050120cab8e9d3de18cee5141f9f3ae7bd5abf452b06c74e8c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T23:20:30Z\\\",\\\"message\\\":\\\"id == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 23:20:29.708321 6322 services_controller.go:445] Built service openshift-dns-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF1205 23:20:29.708749 6322 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8bfg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.742099 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l6r6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"641af4fe-dd54-4118-8985-d37a03d64f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcvhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcvhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l6r6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc 
kubenswrapper[4734]: I1205 23:20:59.757215 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62f
afdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/k
ube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.770838 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.782378 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.784648 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.784717 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.784729 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.784751 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.784764 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:59Z","lastTransitionTime":"2025-12-05T23:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.797319 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c14bdf9de3cac15f0fff38f916e8da01527893739df49f94b97d7aebc76875a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.810488 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},
{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b8065541883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.827569 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2937461b56d6a54bf46d04d1246ef99a00bcc8072b52ccc25001376a3b640fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.845314 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f761cb9e068ee2d46de1b4604f8403e36d7d0d7b8133f0fcb0da1f312f1ef704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.869812 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ed109ca95328fcc458e818da95462a941b14b4a4ad494d73190e64ec494c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.887896 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.887953 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.887967 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.887990 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.888004 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:59Z","lastTransitionTime":"2025-12-05T23:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.890581 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.907092 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\
\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.922137 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b089eaa-85b7-420d-914f-b053257be3c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32d72232eb5162100a1a381e51548864fc732ff00fd26239351ec294328fc7fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9676ebc0e731c50baebbab917a9dc814ceea006a370980021eaeb8bf822825b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be62799b986e89e6324a37ffed14cfc15d4fa6efec043e842534075da2b7547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff9c5cbf877fad8c2d4155cab3be27491de84cf4b7f3476f60a02de39936ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ff9c5cbf877fad8c2d4155cab3be27491de84cf4b7f3476f60a02de39936ab51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:20:59Z is after 2025-08-24T17:21:41Z" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.991721 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.991797 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.991816 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.991849 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:20:59 crc kubenswrapper[4734]: I1205 23:20:59.991874 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:20:59Z","lastTransitionTime":"2025-12-05T23:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.095159 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.095223 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.095236 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.095263 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.095281 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:00Z","lastTransitionTime":"2025-12-05T23:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.198694 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.198761 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.198778 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.198803 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.198819 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:00Z","lastTransitionTime":"2025-12-05T23:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.303434 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.303498 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.303512 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.303571 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.303584 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:00Z","lastTransitionTime":"2025-12-05T23:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.406749 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.406817 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.406837 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.406864 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.406884 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:00Z","lastTransitionTime":"2025-12-05T23:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.511014 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.511084 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.511106 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.511139 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.511159 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:00Z","lastTransitionTime":"2025-12-05T23:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.613083 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.613175 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.613238 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.613255 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:21:00 crc kubenswrapper[4734]: E1205 23:21:00.613346 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:21:00 crc kubenswrapper[4734]: E1205 23:21:00.613512 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:21:00 crc kubenswrapper[4734]: E1205 23:21:00.613669 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:21:00 crc kubenswrapper[4734]: E1205 23:21:00.613759 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.614591 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.614647 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.614667 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.614698 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.614728 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:00Z","lastTransitionTime":"2025-12-05T23:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.717737 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.717795 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.717813 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.717834 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.717848 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:00Z","lastTransitionTime":"2025-12-05T23:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.821487 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.821580 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.821595 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.821616 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.821632 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:00Z","lastTransitionTime":"2025-12-05T23:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.925668 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.925721 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.925731 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.925748 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:00 crc kubenswrapper[4734]: I1205 23:21:00.925758 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:00Z","lastTransitionTime":"2025-12-05T23:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.028871 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.028932 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.028950 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.028974 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.028992 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:01Z","lastTransitionTime":"2025-12-05T23:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.133048 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.134107 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.134179 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.134306 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.134370 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:01Z","lastTransitionTime":"2025-12-05T23:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.148519 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8bfg7_2927a376-2f69-4820-a222-b86f08ece55a/ovnkube-controller/3.log" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.149273 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8bfg7_2927a376-2f69-4820-a222-b86f08ece55a/ovnkube-controller/2.log" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.153086 4734 generic.go:334] "Generic (PLEG): container finished" podID="2927a376-2f69-4820-a222-b86f08ece55a" containerID="9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549" exitCode=1 Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.153158 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" event={"ID":"2927a376-2f69-4820-a222-b86f08ece55a","Type":"ContainerDied","Data":"9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549"} Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.153241 4734 scope.go:117] "RemoveContainer" containerID="ccde57f4cb8d41050120cab8e9d3de18cee5141f9f3ae7bd5abf452b06c74e8c" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.154609 4734 scope.go:117] "RemoveContainer" containerID="9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549" Dec 05 23:21:01 crc kubenswrapper[4734]: E1205 23:21:01.154782 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-8bfg7_openshift-ovn-kubernetes(2927a376-2f69-4820-a222-b86f08ece55a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" podUID="2927a376-2f69-4820-a222-b86f08ece55a" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.174130 4734 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7414d8e5-13fa-40b1-b442-3ceee2425ee1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"po
dIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:21:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.189624 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:21:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.201403 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9l87s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6eebee8c-1183-4010-b59c-8f880a4e669d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b6c6b8505646feac77ac9d5fa758360c9f9a9f721ee74b52f449ec8ed30dba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rh74z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9l87s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:21:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.219061 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d6kmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d76dc4e-40f3-4457-9a99-16f9a8ca8081\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8453d43131f407bdf61410dd38713b44aea86c8647825551f40b2c41552206e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T23:20:47Z\\\",\\\"message\\\":\\\"2025-12-05T23:20:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_90161e0f-9906-4827-895c-9cd783dd3007\\\\n2025-12-05T23:20:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_90161e0f-9906-4827-895c-9cd783dd3007 to /host/opt/cni/bin/\\\\n2025-12-05T23:20:02Z [verbose] multus-daemon started\\\\n2025-12-05T23:20:02Z [verbose] Readiness Indicator file check\\\\n2025-12-05T23:20:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"ho
st-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js9qp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d6kmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:21:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.236472 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.236507 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.236520 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.236559 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.236571 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:01Z","lastTransitionTime":"2025-12-05T23:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.239301 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2927a376-2f69-4820-a222-b86f08ece55a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccde57f4cb8d41050120cab8e9d3de18cee5141f9f3ae7bd5abf452b06c74e8c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T23:20:30Z\\\",\\\"message\\\":\\\"id == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e 
Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1205 23:20:29.708321 6322 services_controller.go:445] Built service openshift-dns-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF1205 23:20:29.708749 6322 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T23:21:00Z\\\",\\\"message\\\":\\\"ices.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"ba175bbe-5cc4-47e6-a32d-57693e1320bd\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 23:20:59.549988 6704 services_controller.go:360] Finished syncing service packageserver-service 
on\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ssw64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8bfg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:21:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.253575 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l6r6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"641af4fe-dd54-4118-8985-d37a03d64f79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcvhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcvhz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l6r6g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:21:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.268164 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082839cfb65e8fad77cd36c44dc30ee12482036a3bb6e61f0cdafa2bb8370ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:21:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.281351 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:21:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.292592 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c14bdf9de3cac15f0fff38f916e8da01527893739df49f94b97d7aebc76875a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T23:21:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.305867 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914df4e052706dcf1487cad9287cf46b28781f9720235c6774fa36ee818cb7b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1793e8462b80655
41883b1564a5e41f3535f80c0021b63a8a90a7522e3586c18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:21:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.317583 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bfxx2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f57d8d-f8e7-4ccc-b41f-26ebca61d0f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2937461b56d6a54bf46d04d1246ef99a00bcc8072b52ccc25001376a3b640fb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8tnw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bfxx2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:21:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.331688 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65758270-a7a7-46b5-af95-0588daf9fa86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f761cb9e068ee2d46de1b4604f8403e36d7d0d7b8133f0fcb0da1f312f1ef704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m65jq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vn94d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:21:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.339588 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.339636 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.339650 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.339667 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.339681 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:01Z","lastTransitionTime":"2025-12-05T23:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.348443 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k52tb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7da080e9-7084-4e77-9e1a-051dc8b97f25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ed109ca95328fcc458e818da95462a941b14b4a4ad494d73190e64ec494c2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a183e6ae6fa8f55f5c605a8551360650285076e77cd2605119b1d848405bbe1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://6e5e41c729d8b80d1fab3c2bf7d71b2713275377ecc294c6144ef2152be83c87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c845089ad94c77fa229b9cf9604b0d96b2b2eb632ce3b195e6f72f932651a328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e895bccf48584216de92f33fe30a990f0f2e5cd6acf7d1115b8e8d0c36be5112\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bc56c859e510064fa65514f5e39f2befb81f233287892d5c8d18a9f844457db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95fc33d5410e525d398beda77541a32096be9cb9a3f3c45c9a9eb6dca883d9a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2r4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k52tb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:21:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.363497 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:21:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.377194 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b089eaa-85b7-420d-914f-b053257be3c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32d72232eb5162100a1a381e51548864fc732ff00fd26239351ec294328fc7fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9676ebc0e731c50baebbab917a9dc814ceea006a370980021eaeb8bf822825b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be62799b986e89e6324a37ffed14cfc15d4fa6efec043e842534075da2b7547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff9c5cbf877fad8c2d4155cab3be27491de84cf4b7f3476f60a02de39936ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ff9c5cbf877fad8c2d4155cab3be27491de84cf4b7f3476f60a02de39936ab51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:21:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.393453 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:21:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.406484 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a85cf646-baec-45c1-a31e-97ce9e087c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44a4d2f938eb5aab362754086f82c0bb45b25e167e76d2dbe7192c92982ea9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93325400e317291da2931220b981cce963abd
9cf3cb36d1959f19d136c0d2134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wdk8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:21:01Z is after 2025-08-24T17:21:41Z" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.442630 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.442693 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.442709 4734 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.442733 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.442752 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:01Z","lastTransitionTime":"2025-12-05T23:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.546089 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.546173 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.546197 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.546228 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.546253 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:01Z","lastTransitionTime":"2025-12-05T23:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.648655 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.648697 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.648710 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.648729 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.648739 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:01Z","lastTransitionTime":"2025-12-05T23:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.752246 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.752340 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.752375 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.752415 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.752441 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:01Z","lastTransitionTime":"2025-12-05T23:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.855693 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.855763 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.855783 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.855807 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.855828 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:01Z","lastTransitionTime":"2025-12-05T23:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.959366 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.959438 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.959455 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.959482 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:01 crc kubenswrapper[4734]: I1205 23:21:01.959502 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:01Z","lastTransitionTime":"2025-12-05T23:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.063119 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.063176 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.063186 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.063203 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.063218 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:02Z","lastTransitionTime":"2025-12-05T23:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.093243 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:21:02 crc kubenswrapper[4734]: E1205 23:21:02.093426 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-05 23:22:06.093391784 +0000 UTC m=+146.776796100 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.093479 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.093645 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:21:02 crc kubenswrapper[4734]: E1205 23:21:02.093756 4734 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 23:21:02 crc kubenswrapper[4734]: E1205 23:21:02.093773 4734 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 23:21:02 crc kubenswrapper[4734]: E1205 23:21:02.093856 4734 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 23:22:06.093838645 +0000 UTC m=+146.777242951 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 23:21:02 crc kubenswrapper[4734]: E1205 23:21:02.093883 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 23:22:06.093870546 +0000 UTC m=+146.777274852 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.161276 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8bfg7_2927a376-2f69-4820-a222-b86f08ece55a/ovnkube-controller/3.log" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.165375 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.165419 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.165429 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.165447 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.165458 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:02Z","lastTransitionTime":"2025-12-05T23:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.194280 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.194345 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:21:02 crc kubenswrapper[4734]: E1205 23:21:02.194457 4734 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 23:21:02 crc kubenswrapper[4734]: E1205 23:21:02.194469 4734 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 23:21:02 crc kubenswrapper[4734]: E1205 23:21:02.194516 4734 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 23:21:02 crc kubenswrapper[4734]: E1205 23:21:02.194567 4734 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 23:21:02 crc 
kubenswrapper[4734]: E1205 23:21:02.194648 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 23:22:06.194623842 +0000 UTC m=+146.878028148 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 23:21:02 crc kubenswrapper[4734]: E1205 23:21:02.194477 4734 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 23:21:02 crc kubenswrapper[4734]: E1205 23:21:02.194699 4734 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 23:21:02 crc kubenswrapper[4734]: E1205 23:21:02.194773 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 23:22:06.194751875 +0000 UTC m=+146.878156191 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.268161 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.268247 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.268265 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.268290 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.268307 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:02Z","lastTransitionTime":"2025-12-05T23:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.372337 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.372392 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.372403 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.372428 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.372441 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:02Z","lastTransitionTime":"2025-12-05T23:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.475871 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.475943 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.475963 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.475995 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.476016 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:02Z","lastTransitionTime":"2025-12-05T23:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.579278 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.579337 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.579354 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.579376 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.579391 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:02Z","lastTransitionTime":"2025-12-05T23:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.613240 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.613297 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.613254 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.613418 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:21:02 crc kubenswrapper[4734]: E1205 23:21:02.613619 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:21:02 crc kubenswrapper[4734]: E1205 23:21:02.613783 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:21:02 crc kubenswrapper[4734]: E1205 23:21:02.613967 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:21:02 crc kubenswrapper[4734]: E1205 23:21:02.614068 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.682278 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.682429 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.682453 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.682951 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.682982 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:02Z","lastTransitionTime":"2025-12-05T23:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.786726 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.786786 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.786797 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.786816 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.786828 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:02Z","lastTransitionTime":"2025-12-05T23:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.890435 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.890554 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.890580 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.890644 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.890667 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:02Z","lastTransitionTime":"2025-12-05T23:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.994097 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.994159 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.994175 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.994198 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:02 crc kubenswrapper[4734]: I1205 23:21:02.994215 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:02Z","lastTransitionTime":"2025-12-05T23:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.097363 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.097414 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.097425 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.097444 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.097455 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:03Z","lastTransitionTime":"2025-12-05T23:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.200754 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.200813 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.200828 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.200846 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.200861 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:03Z","lastTransitionTime":"2025-12-05T23:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.304510 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.304610 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.304632 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.304659 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.304678 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:03Z","lastTransitionTime":"2025-12-05T23:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.412635 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.412739 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.412759 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.412784 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.412804 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:03Z","lastTransitionTime":"2025-12-05T23:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.515709 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.515796 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.515814 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.515842 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.515862 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:03Z","lastTransitionTime":"2025-12-05T23:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.619078 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.619166 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.619194 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.619226 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.619251 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:03Z","lastTransitionTime":"2025-12-05T23:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.723760 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.723843 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.723863 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.723897 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.723917 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:03Z","lastTransitionTime":"2025-12-05T23:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.828138 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.828216 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.828237 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.828267 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.828291 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:03Z","lastTransitionTime":"2025-12-05T23:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.931193 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.931236 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.931245 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.931265 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:03 crc kubenswrapper[4734]: I1205 23:21:03.931277 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:03Z","lastTransitionTime":"2025-12-05T23:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.033847 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.033883 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.033894 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.033912 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.033922 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:04Z","lastTransitionTime":"2025-12-05T23:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.136757 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.136827 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.136841 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.136869 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.136884 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:04Z","lastTransitionTime":"2025-12-05T23:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.220886 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.220943 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.220955 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.220977 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.220991 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:04Z","lastTransitionTime":"2025-12-05T23:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:04 crc kubenswrapper[4734]: E1205 23:21:04.236611 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:21:04Z is after 2025-08-24T17:21:41Z" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.241007 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.241077 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.241086 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.241115 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.241128 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:04Z","lastTransitionTime":"2025-12-05T23:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:04 crc kubenswrapper[4734]: E1205 23:21:04.255679 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:21:04Z is after 2025-08-24T17:21:41Z" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.260372 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.260413 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.260422 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.260439 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.260450 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:04Z","lastTransitionTime":"2025-12-05T23:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:04 crc kubenswrapper[4734]: E1205 23:21:04.276958 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:21:04Z is after 2025-08-24T17:21:41Z" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.281836 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.281898 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.281917 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.281942 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.281999 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:04Z","lastTransitionTime":"2025-12-05T23:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:04 crc kubenswrapper[4734]: E1205 23:21:04.304089 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:21:04Z is after 2025-08-24T17:21:41Z" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.309232 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.309315 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.309339 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.309373 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.309393 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:04Z","lastTransitionTime":"2025-12-05T23:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:04 crc kubenswrapper[4734]: E1205 23:21:04.324593 4734 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T23:21:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"bba22b9d-56b5-49db-9757-30928c54213a\\\",\\\"systemUUID\\\":\\\"33f74fdf-48ac-436c-92bc-f6724ef71400\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:21:04Z is after 2025-08-24T17:21:41Z" Dec 05 23:21:04 crc kubenswrapper[4734]: E1205 23:21:04.324858 4734 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.327458 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.327503 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.327545 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.327570 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.327585 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:04Z","lastTransitionTime":"2025-12-05T23:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.431052 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.431123 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.431134 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.431154 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.431168 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:04Z","lastTransitionTime":"2025-12-05T23:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.534266 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.534322 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.534335 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.534357 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.534370 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:04Z","lastTransitionTime":"2025-12-05T23:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.614068 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.614157 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.614184 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.614184 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:21:04 crc kubenswrapper[4734]: E1205 23:21:04.614268 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:21:04 crc kubenswrapper[4734]: E1205 23:21:04.614352 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:21:04 crc kubenswrapper[4734]: E1205 23:21:04.614476 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:21:04 crc kubenswrapper[4734]: E1205 23:21:04.614804 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.637343 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.637397 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.637411 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.637431 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.637444 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:04Z","lastTransitionTime":"2025-12-05T23:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.740333 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.740686 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.740699 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.740721 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.740734 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:04Z","lastTransitionTime":"2025-12-05T23:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.843138 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.843199 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.843210 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.843233 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.843245 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:04Z","lastTransitionTime":"2025-12-05T23:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.946287 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.946358 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.946379 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.946405 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:04 crc kubenswrapper[4734]: I1205 23:21:04.946424 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:04Z","lastTransitionTime":"2025-12-05T23:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.049263 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.049317 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.049329 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.049350 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.049362 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:05Z","lastTransitionTime":"2025-12-05T23:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.152580 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.152660 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.152677 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.152700 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.152733 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:05Z","lastTransitionTime":"2025-12-05T23:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.256870 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.256952 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.256974 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.257006 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.257033 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:05Z","lastTransitionTime":"2025-12-05T23:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.359865 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.359934 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.359957 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.359990 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.360014 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:05Z","lastTransitionTime":"2025-12-05T23:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.463616 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.463684 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.463702 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.463725 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.463748 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:05Z","lastTransitionTime":"2025-12-05T23:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.567267 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.567349 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.567375 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.567410 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.567436 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:05Z","lastTransitionTime":"2025-12-05T23:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.670375 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.670477 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.670505 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.670578 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.670606 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:05Z","lastTransitionTime":"2025-12-05T23:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.774623 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.774696 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.774714 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.774746 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.774771 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:05Z","lastTransitionTime":"2025-12-05T23:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.878695 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.878767 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.878786 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.878812 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.878829 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:05Z","lastTransitionTime":"2025-12-05T23:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.982492 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.982574 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.982584 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.982607 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:21:05 crc kubenswrapper[4734]: I1205 23:21:05.982619 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:05Z","lastTransitionTime":"2025-12-05T23:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.085389 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.085481 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.085505 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.085603 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.085632 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:06Z","lastTransitionTime":"2025-12-05T23:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.188643 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.188734 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.188761 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.188794 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.188823 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:06Z","lastTransitionTime":"2025-12-05T23:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.292267 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.292326 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.292339 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.292360 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.292376 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:06Z","lastTransitionTime":"2025-12-05T23:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.395880 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.395978 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.396003 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.396035 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.396061 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:06Z","lastTransitionTime":"2025-12-05T23:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.499594 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.499678 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.499701 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.499731 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.499754 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:06Z","lastTransitionTime":"2025-12-05T23:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.603342 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.603429 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.603452 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.603489 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.603512 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:06Z","lastTransitionTime":"2025-12-05T23:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.613726 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.613839 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.613766 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 23:21:06 crc kubenswrapper[4734]: E1205 23:21:06.613974 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.614042 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 23:21:06 crc kubenswrapper[4734]: E1205 23:21:06.614185 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 23:21:06 crc kubenswrapper[4734]: E1205 23:21:06.614394 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 23:21:06 crc kubenswrapper[4734]: E1205 23:21:06.614676 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.707666 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.707741 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.707758 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.707807 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.707835 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:06Z","lastTransitionTime":"2025-12-05T23:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.811028 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.811089 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.811105 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.811124 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.811139 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:06Z","lastTransitionTime":"2025-12-05T23:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.914909 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.915005 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.915040 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.915072 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:21:06 crc kubenswrapper[4734]: I1205 23:21:06.915096 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:06Z","lastTransitionTime":"2025-12-05T23:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.018282 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.018329 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.018341 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.018358 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.018370 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:07Z","lastTransitionTime":"2025-12-05T23:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.121666 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.121723 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.121733 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.121757 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.121771 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:07Z","lastTransitionTime":"2025-12-05T23:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.224606 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.224678 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.224699 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.224726 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.224743 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:07Z","lastTransitionTime":"2025-12-05T23:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.327847 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.327899 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.327916 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.327935 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.327946 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:07Z","lastTransitionTime":"2025-12-05T23:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.430428 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.430479 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.430492 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.430512 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.430545 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:07Z","lastTransitionTime":"2025-12-05T23:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.532711 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.532773 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.532784 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.532804 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.533067 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:07Z","lastTransitionTime":"2025-12-05T23:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.640128 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.640178 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.640189 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.640204 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.640216 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:07Z","lastTransitionTime":"2025-12-05T23:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.743049 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.743121 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.743131 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.743151 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.743162 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:07Z","lastTransitionTime":"2025-12-05T23:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.845306 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.845350 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.845359 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.845375 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.845387 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:07Z","lastTransitionTime":"2025-12-05T23:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.948321 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.948372 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.948383 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.948403 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 23:21:07 crc kubenswrapper[4734]: I1205 23:21:07.948418 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:07Z","lastTransitionTime":"2025-12-05T23:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.052448 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.052545 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.052560 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.052585 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.052598 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:08Z","lastTransitionTime":"2025-12-05T23:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.156993 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.157076 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.157094 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.157121 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.157140 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:08Z","lastTransitionTime":"2025-12-05T23:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.260611 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.260678 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.260690 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.260707 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.260718 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:08Z","lastTransitionTime":"2025-12-05T23:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.365108 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.365207 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.365227 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.365286 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.365305 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:08Z","lastTransitionTime":"2025-12-05T23:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.467886 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.467998 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.468045 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.468069 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.468086 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:08Z","lastTransitionTime":"2025-12-05T23:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.571442 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.571555 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.571575 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.571604 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.571624 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:08Z","lastTransitionTime":"2025-12-05T23:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.614001 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.614028 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.614223 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:21:08 crc kubenswrapper[4734]: E1205 23:21:08.614413 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.614442 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:21:08 crc kubenswrapper[4734]: E1205 23:21:08.614672 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:21:08 crc kubenswrapper[4734]: E1205 23:21:08.614768 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:21:08 crc kubenswrapper[4734]: E1205 23:21:08.614864 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.674434 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.674490 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.674502 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.674552 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.674568 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:08Z","lastTransitionTime":"2025-12-05T23:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.777319 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.777363 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.777374 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.777402 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.777414 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:08Z","lastTransitionTime":"2025-12-05T23:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.880737 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.880790 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.880801 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.880820 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.880832 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:08Z","lastTransitionTime":"2025-12-05T23:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.983403 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.983460 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.983472 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.983493 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:08 crc kubenswrapper[4734]: I1205 23:21:08.983507 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:08Z","lastTransitionTime":"2025-12-05T23:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.086049 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.086083 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.086093 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.086109 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.086118 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:09Z","lastTransitionTime":"2025-12-05T23:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.188319 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.188398 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.188417 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.188454 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.188472 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:09Z","lastTransitionTime":"2025-12-05T23:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.291319 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.291373 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.291386 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.291408 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.291419 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:09Z","lastTransitionTime":"2025-12-05T23:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.394490 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.394566 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.394575 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.394600 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.394620 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:09Z","lastTransitionTime":"2025-12-05T23:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.496601 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.496663 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.496676 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.496697 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.496710 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:09Z","lastTransitionTime":"2025-12-05T23:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.599173 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.599263 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.599282 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.599321 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.599342 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:09Z","lastTransitionTime":"2025-12-05T23:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.635626 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a44d00-5d9b-41b8-92da-5fb007474364\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6654d84cde342187d1f22ceb9d9a0071d20db5499940f237b891eb0340acef5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5181f04d7a
dfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec9fa0fe762003b269fb6cc776748dfe960734f5d3aeff0482643e41f4e6e71a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd84c2cc93cd524a14d5c6504ef3dce1609072424c2b6da3932a1b184d533aa3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:21:09Z is after 2025-08-24T17:21:41Z" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.652845 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b089eaa-85b7-420d-914f-b053257be3c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32d72232eb5162100a1a381e51548864fc732ff00fd26239351ec294328fc7fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9676ebc0e731c50baebbab917a9dc814ceea006a370980021eaeb8bf822825b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6be62799b986e89e6324a37ffed14cfc15d4fa6efec043e842534075da2b7547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff9c5cbf877fad8c2d4155cab3be27491de84cf4b7f3476f60a02de39936ab51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ff9c5cbf877fad8c2d4155cab3be27491de84cf4b7f3476f60a02de39936ab51\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T23:19:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T23:19:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:19:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:21:09Z is after 2025-08-24T17:21:41Z" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.669954 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T23:19:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:21:09Z is after 2025-08-24T17:21:41Z" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.684281 4734 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a85cf646-baec-45c1-a31e-97ce9e087c69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T23:20:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44a4d2f938eb5aab362754086f82c0bb45b25e167e76d2dbe7192c92982ea9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93325400e317291da2931220b981cce963abd
9cf3cb36d1959f19d136c0d2134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T23:20:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tqrvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T23:20:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wdk8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T23:21:09Z is after 2025-08-24T17:21:41Z" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.702673 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.702801 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.702822 4734 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.702855 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.702888 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:09Z","lastTransitionTime":"2025-12-05T23:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.730833 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=73.73080236 podStartE2EDuration="1m13.73080236s" podCreationTimestamp="2025-12-05 23:19:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:21:09.715181345 +0000 UTC m=+90.398585641" watchObservedRunningTime="2025-12-05 23:21:09.73080236 +0000 UTC m=+90.414206646" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.742638 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9l87s" podStartSLOduration=69.742615644 podStartE2EDuration="1m9.742615644s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:21:09.742434289 +0000 UTC m=+90.425838575" watchObservedRunningTime="2025-12-05 23:21:09.742615644 +0000 UTC m=+90.426019920" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.783838 4734 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/multus-d6kmh" podStartSLOduration=69.783810671 podStartE2EDuration="1m9.783810671s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:21:09.761158168 +0000 UTC m=+90.444562444" watchObservedRunningTime="2025-12-05 23:21:09.783810671 +0000 UTC m=+90.467214957" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.805480 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.805567 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.805590 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.805614 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.805628 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:09Z","lastTransitionTime":"2025-12-05T23:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.812276 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-k52tb" podStartSLOduration=69.812242283 podStartE2EDuration="1m9.812242283s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:21:09.812225822 +0000 UTC m=+90.495630108" watchObservedRunningTime="2025-12-05 23:21:09.812242283 +0000 UTC m=+90.495646559" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.890396 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-bfxx2" podStartSLOduration=69.890361376 podStartE2EDuration="1m9.890361376s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:21:09.889602289 +0000 UTC m=+90.573006565" watchObservedRunningTime="2025-12-05 23:21:09.890361376 +0000 UTC m=+90.573765692" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.903239 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podStartSLOduration=69.903216445 podStartE2EDuration="1m9.903216445s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:21:09.902736623 +0000 UTC m=+90.586140929" watchObservedRunningTime="2025-12-05 23:21:09.903216445 +0000 UTC m=+90.586620761" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.908265 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 
23:21:09.908335 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.908356 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.908380 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:09 crc kubenswrapper[4734]: I1205 23:21:09.908400 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:09Z","lastTransitionTime":"2025-12-05T23:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.010956 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.011019 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.011032 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.011057 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.011072 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:10Z","lastTransitionTime":"2025-12-05T23:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.114226 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.114287 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.114302 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.114323 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.114334 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:10Z","lastTransitionTime":"2025-12-05T23:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.217254 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.217322 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.217343 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.217369 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.217389 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:10Z","lastTransitionTime":"2025-12-05T23:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.320973 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.321052 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.321101 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.321130 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.321154 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:10Z","lastTransitionTime":"2025-12-05T23:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.424196 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.424275 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.424296 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.424324 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.424345 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:10Z","lastTransitionTime":"2025-12-05T23:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.528947 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.529003 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.529015 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.529041 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.529055 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:10Z","lastTransitionTime":"2025-12-05T23:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.613438 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.613587 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.613697 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.613988 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:21:10 crc kubenswrapper[4734]: E1205 23:21:10.614185 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:21:10 crc kubenswrapper[4734]: E1205 23:21:10.614390 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:21:10 crc kubenswrapper[4734]: E1205 23:21:10.614571 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:21:10 crc kubenswrapper[4734]: E1205 23:21:10.614784 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.633177 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.633227 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.633237 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.633255 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.633266 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:10Z","lastTransitionTime":"2025-12-05T23:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.633689 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.736587 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.736678 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.736706 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.736739 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.736767 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:10Z","lastTransitionTime":"2025-12-05T23:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.839657 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.839725 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.839744 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.839774 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.839796 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:10Z","lastTransitionTime":"2025-12-05T23:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.943155 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.943221 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.943239 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.943269 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:10 crc kubenswrapper[4734]: I1205 23:21:10.943289 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:10Z","lastTransitionTime":"2025-12-05T23:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.046815 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.046884 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.046907 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.046937 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.046998 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:11Z","lastTransitionTime":"2025-12-05T23:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.150662 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.150788 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.150815 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.150857 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.150881 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:11Z","lastTransitionTime":"2025-12-05T23:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.253690 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.253766 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.253781 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.253821 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.253832 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:11Z","lastTransitionTime":"2025-12-05T23:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.356811 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.356877 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.356894 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.356919 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.356939 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:11Z","lastTransitionTime":"2025-12-05T23:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.460922 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.460989 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.461008 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.461034 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.461054 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:11Z","lastTransitionTime":"2025-12-05T23:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.564985 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.565088 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.565115 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.565156 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.565184 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:11Z","lastTransitionTime":"2025-12-05T23:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.668518 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.668710 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.668741 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.668771 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.668795 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:11Z","lastTransitionTime":"2025-12-05T23:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.771831 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.771892 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.771902 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.771920 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.771933 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:11Z","lastTransitionTime":"2025-12-05T23:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.875696 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.875823 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.875889 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.875963 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.875989 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:11Z","lastTransitionTime":"2025-12-05T23:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.979920 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.980029 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.980049 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.980112 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:11 crc kubenswrapper[4734]: I1205 23:21:11.980131 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:11Z","lastTransitionTime":"2025-12-05T23:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.083878 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.083931 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.083942 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.083969 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.083982 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:12Z","lastTransitionTime":"2025-12-05T23:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.187428 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.187500 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.187569 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.187608 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.187631 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:12Z","lastTransitionTime":"2025-12-05T23:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.292410 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.292487 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.292515 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.292585 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.292610 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:12Z","lastTransitionTime":"2025-12-05T23:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.397448 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.397520 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.397563 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.397588 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.397607 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:12Z","lastTransitionTime":"2025-12-05T23:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.500799 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.500866 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.500886 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.500914 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.500936 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:12Z","lastTransitionTime":"2025-12-05T23:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.604409 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.604498 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.604512 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.604557 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.604572 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:12Z","lastTransitionTime":"2025-12-05T23:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.613347 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.613398 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.613398 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.613368 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:21:12 crc kubenswrapper[4734]: E1205 23:21:12.613540 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:21:12 crc kubenswrapper[4734]: E1205 23:21:12.613667 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:21:12 crc kubenswrapper[4734]: E1205 23:21:12.613784 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:21:12 crc kubenswrapper[4734]: E1205 23:21:12.613890 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.708854 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.708929 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.708948 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.708977 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.708997 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:12Z","lastTransitionTime":"2025-12-05T23:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.812265 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.812336 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.812356 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.812384 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.812405 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:12Z","lastTransitionTime":"2025-12-05T23:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.916685 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.916797 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.916836 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.916880 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:12 crc kubenswrapper[4734]: I1205 23:21:12.916907 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:12Z","lastTransitionTime":"2025-12-05T23:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.020609 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.020683 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.020697 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.020719 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.020737 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:13Z","lastTransitionTime":"2025-12-05T23:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.124068 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.124142 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.124181 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.124221 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.124248 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:13Z","lastTransitionTime":"2025-12-05T23:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.227740 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.227837 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.227865 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.227906 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.227933 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:13Z","lastTransitionTime":"2025-12-05T23:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.330962 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.331026 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.331044 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.331070 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.331088 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:13Z","lastTransitionTime":"2025-12-05T23:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.434400 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.434472 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.434491 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.434519 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.434577 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:13Z","lastTransitionTime":"2025-12-05T23:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.539203 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.539270 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.539291 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.539317 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.539338 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:13Z","lastTransitionTime":"2025-12-05T23:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.642333 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.642420 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.642441 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.642468 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.642491 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:13Z","lastTransitionTime":"2025-12-05T23:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.746378 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.746453 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.746476 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.746509 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.746571 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:13Z","lastTransitionTime":"2025-12-05T23:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.849938 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.850003 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.850014 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.850036 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.850048 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:13Z","lastTransitionTime":"2025-12-05T23:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.953431 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.953506 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.953554 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.953580 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:13 crc kubenswrapper[4734]: I1205 23:21:13.953598 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:13Z","lastTransitionTime":"2025-12-05T23:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.056858 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.056940 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.056960 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.056998 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.057019 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:14Z","lastTransitionTime":"2025-12-05T23:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.160660 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.160729 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.160738 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.160758 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.160768 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:14Z","lastTransitionTime":"2025-12-05T23:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.263760 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.263794 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.263802 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.263817 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.263827 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:14Z","lastTransitionTime":"2025-12-05T23:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.366798 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.366871 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.366892 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.366921 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.366940 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:14Z","lastTransitionTime":"2025-12-05T23:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.470131 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.470241 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.470255 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.470276 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.470290 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:14Z","lastTransitionTime":"2025-12-05T23:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.574091 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.574168 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.574182 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.574203 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.574245 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:14Z","lastTransitionTime":"2025-12-05T23:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.613743 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.613797 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.613827 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.613889 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:21:14 crc kubenswrapper[4734]: E1205 23:21:14.613964 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:21:14 crc kubenswrapper[4734]: E1205 23:21:14.614093 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:21:14 crc kubenswrapper[4734]: E1205 23:21:14.614253 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:21:14 crc kubenswrapper[4734]: E1205 23:21:14.614395 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.677680 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.677746 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.677768 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.677797 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.677813 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:14Z","lastTransitionTime":"2025-12-05T23:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.713922 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.713979 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.713989 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.714009 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.714020 4734 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T23:21:14Z","lastTransitionTime":"2025-12-05T23:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.772833 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-2qbjc"] Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.774215 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2qbjc" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.781092 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.781144 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.781229 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.781145 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.803878 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=76.803848571 podStartE2EDuration="1m16.803848571s" podCreationTimestamp="2025-12-05 23:19:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:21:14.80256041 +0000 UTC m=+95.485964716" watchObservedRunningTime="2025-12-05 23:21:14.803848571 +0000 UTC m=+95.487252847" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.822190 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=47.82216694 podStartE2EDuration="47.82216694s" podCreationTimestamp="2025-12-05 23:20:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:21:14.821367532 +0000 UTC m=+95.504771808" watchObservedRunningTime="2025-12-05 23:21:14.82216694 +0000 
UTC m=+95.505571226" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.854152 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wdk8s" podStartSLOduration=74.854125167 podStartE2EDuration="1m14.854125167s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:21:14.853878991 +0000 UTC m=+95.537283267" watchObservedRunningTime="2025-12-05 23:21:14.854125167 +0000 UTC m=+95.537529453" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.854825 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bd371e03-a06e-498c-8060-903f2e894cc2-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2qbjc\" (UID: \"bd371e03-a06e-498c-8060-903f2e894cc2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2qbjc" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.854941 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/bd371e03-a06e-498c-8060-903f2e894cc2-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2qbjc\" (UID: \"bd371e03-a06e-498c-8060-903f2e894cc2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2qbjc" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.855065 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd371e03-a06e-498c-8060-903f2e894cc2-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2qbjc\" (UID: \"bd371e03-a06e-498c-8060-903f2e894cc2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2qbjc" Dec 05 23:21:14 crc kubenswrapper[4734]: 
I1205 23:21:14.855202 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd371e03-a06e-498c-8060-903f2e894cc2-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2qbjc\" (UID: \"bd371e03-a06e-498c-8060-903f2e894cc2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2qbjc" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.855268 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/bd371e03-a06e-498c-8060-903f2e894cc2-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2qbjc\" (UID: \"bd371e03-a06e-498c-8060-903f2e894cc2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2qbjc" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.881935 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=4.881910063 podStartE2EDuration="4.881910063s" podCreationTimestamp="2025-12-05 23:21:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:21:14.88095419 +0000 UTC m=+95.564358506" watchObservedRunningTime="2025-12-05 23:21:14.881910063 +0000 UTC m=+95.565314349" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.956571 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/bd371e03-a06e-498c-8060-903f2e894cc2-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2qbjc\" (UID: \"bd371e03-a06e-498c-8060-903f2e894cc2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2qbjc" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.956720 4734 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bd371e03-a06e-498c-8060-903f2e894cc2-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2qbjc\" (UID: \"bd371e03-a06e-498c-8060-903f2e894cc2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2qbjc" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.956762 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/bd371e03-a06e-498c-8060-903f2e894cc2-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2qbjc\" (UID: \"bd371e03-a06e-498c-8060-903f2e894cc2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2qbjc" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.956797 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd371e03-a06e-498c-8060-903f2e894cc2-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2qbjc\" (UID: \"bd371e03-a06e-498c-8060-903f2e894cc2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2qbjc" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.956806 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/bd371e03-a06e-498c-8060-903f2e894cc2-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2qbjc\" (UID: \"bd371e03-a06e-498c-8060-903f2e894cc2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2qbjc" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.956899 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd371e03-a06e-498c-8060-903f2e894cc2-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2qbjc\" (UID: \"bd371e03-a06e-498c-8060-903f2e894cc2\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2qbjc" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.957049 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/bd371e03-a06e-498c-8060-903f2e894cc2-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2qbjc\" (UID: \"bd371e03-a06e-498c-8060-903f2e894cc2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2qbjc" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.958644 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bd371e03-a06e-498c-8060-903f2e894cc2-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2qbjc\" (UID: \"bd371e03-a06e-498c-8060-903f2e894cc2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2qbjc" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.965698 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd371e03-a06e-498c-8060-903f2e894cc2-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2qbjc\" (UID: \"bd371e03-a06e-498c-8060-903f2e894cc2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2qbjc" Dec 05 23:21:14 crc kubenswrapper[4734]: I1205 23:21:14.975470 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd371e03-a06e-498c-8060-903f2e894cc2-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2qbjc\" (UID: \"bd371e03-a06e-498c-8060-903f2e894cc2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2qbjc" Dec 05 23:21:15 crc kubenswrapper[4734]: I1205 23:21:15.092654 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2qbjc" Dec 05 23:21:15 crc kubenswrapper[4734]: I1205 23:21:15.226928 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2qbjc" event={"ID":"bd371e03-a06e-498c-8060-903f2e894cc2","Type":"ContainerStarted","Data":"f523f9fde19cb89ec5cb9b9d3ae5559752fa2ebd0dfdd6b16dbc28c6ba242ebb"} Dec 05 23:21:15 crc kubenswrapper[4734]: I1205 23:21:15.614876 4734 scope.go:117] "RemoveContainer" containerID="9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549" Dec 05 23:21:15 crc kubenswrapper[4734]: E1205 23:21:15.615182 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-8bfg7_openshift-ovn-kubernetes(2927a376-2f69-4820-a222-b86f08ece55a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" podUID="2927a376-2f69-4820-a222-b86f08ece55a" Dec 05 23:21:16 crc kubenswrapper[4734]: I1205 23:21:16.232408 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2qbjc" event={"ID":"bd371e03-a06e-498c-8060-903f2e894cc2","Type":"ContainerStarted","Data":"438d0b558993cd9ca43a92aa5be8a9d0056cd4fd4ee4a83e84708e7c03835f70"} Dec 05 23:21:16 crc kubenswrapper[4734]: I1205 23:21:16.254244 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2qbjc" podStartSLOduration=76.254223453 podStartE2EDuration="1m16.254223453s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:21:16.253196319 +0000 UTC m=+96.936600625" watchObservedRunningTime="2025-12-05 23:21:16.254223453 +0000 UTC 
m=+96.937627739" Dec 05 23:21:16 crc kubenswrapper[4734]: I1205 23:21:16.614035 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:21:16 crc kubenswrapper[4734]: I1205 23:21:16.614134 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:21:16 crc kubenswrapper[4734]: E1205 23:21:16.614252 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:21:16 crc kubenswrapper[4734]: I1205 23:21:16.614143 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:21:16 crc kubenswrapper[4734]: E1205 23:21:16.614339 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:21:16 crc kubenswrapper[4734]: I1205 23:21:16.614166 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:21:16 crc kubenswrapper[4734]: E1205 23:21:16.614463 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:21:16 crc kubenswrapper[4734]: E1205 23:21:16.614546 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:21:18 crc kubenswrapper[4734]: I1205 23:21:18.613541 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:21:18 crc kubenswrapper[4734]: I1205 23:21:18.613633 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:21:18 crc kubenswrapper[4734]: I1205 23:21:18.613720 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:21:18 crc kubenswrapper[4734]: E1205 23:21:18.613716 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:21:18 crc kubenswrapper[4734]: I1205 23:21:18.613559 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:21:18 crc kubenswrapper[4734]: E1205 23:21:18.613862 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:21:18 crc kubenswrapper[4734]: E1205 23:21:18.613975 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:21:18 crc kubenswrapper[4734]: E1205 23:21:18.614068 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:21:18 crc kubenswrapper[4734]: I1205 23:21:18.806061 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/641af4fe-dd54-4118-8985-d37a03d64f79-metrics-certs\") pod \"network-metrics-daemon-l6r6g\" (UID: \"641af4fe-dd54-4118-8985-d37a03d64f79\") " pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:21:18 crc kubenswrapper[4734]: E1205 23:21:18.806309 4734 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 23:21:18 crc kubenswrapper[4734]: E1205 23:21:18.806450 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/641af4fe-dd54-4118-8985-d37a03d64f79-metrics-certs podName:641af4fe-dd54-4118-8985-d37a03d64f79 nodeName:}" failed. No retries permitted until 2025-12-05 23:22:22.806420229 +0000 UTC m=+163.489824515 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/641af4fe-dd54-4118-8985-d37a03d64f79-metrics-certs") pod "network-metrics-daemon-l6r6g" (UID: "641af4fe-dd54-4118-8985-d37a03d64f79") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 23:21:19 crc kubenswrapper[4734]: I1205 23:21:19.635818 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 05 23:21:20 crc kubenswrapper[4734]: I1205 23:21:20.613205 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:21:20 crc kubenswrapper[4734]: I1205 23:21:20.613286 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:21:20 crc kubenswrapper[4734]: E1205 23:21:20.613395 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:21:20 crc kubenswrapper[4734]: I1205 23:21:20.613300 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:21:20 crc kubenswrapper[4734]: I1205 23:21:20.613479 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:21:20 crc kubenswrapper[4734]: E1205 23:21:20.613677 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:21:20 crc kubenswrapper[4734]: E1205 23:21:20.613593 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:21:20 crc kubenswrapper[4734]: E1205 23:21:20.613840 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:21:22 crc kubenswrapper[4734]: I1205 23:21:22.613199 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:21:22 crc kubenswrapper[4734]: I1205 23:21:22.613568 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:21:22 crc kubenswrapper[4734]: I1205 23:21:22.613628 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:21:22 crc kubenswrapper[4734]: E1205 23:21:22.613750 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:21:22 crc kubenswrapper[4734]: I1205 23:21:22.613882 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:21:22 crc kubenswrapper[4734]: E1205 23:21:22.614316 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:21:22 crc kubenswrapper[4734]: E1205 23:21:22.614040 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:21:22 crc kubenswrapper[4734]: E1205 23:21:22.614634 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:21:24 crc kubenswrapper[4734]: I1205 23:21:24.613184 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:21:24 crc kubenswrapper[4734]: I1205 23:21:24.613233 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:21:24 crc kubenswrapper[4734]: I1205 23:21:24.613264 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:21:24 crc kubenswrapper[4734]: I1205 23:21:24.613286 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:21:24 crc kubenswrapper[4734]: E1205 23:21:24.613405 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:21:24 crc kubenswrapper[4734]: E1205 23:21:24.613463 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:21:24 crc kubenswrapper[4734]: E1205 23:21:24.613556 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:21:24 crc kubenswrapper[4734]: E1205 23:21:24.613690 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:21:26 crc kubenswrapper[4734]: I1205 23:21:26.613427 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:21:26 crc kubenswrapper[4734]: E1205 23:21:26.614466 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:21:26 crc kubenswrapper[4734]: I1205 23:21:26.613591 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:21:26 crc kubenswrapper[4734]: I1205 23:21:26.613675 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:21:26 crc kubenswrapper[4734]: E1205 23:21:26.614945 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:21:26 crc kubenswrapper[4734]: I1205 23:21:26.613591 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:21:26 crc kubenswrapper[4734]: I1205 23:21:26.615012 4734 scope.go:117] "RemoveContainer" containerID="9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549" Dec 05 23:21:26 crc kubenswrapper[4734]: E1205 23:21:26.615021 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:21:26 crc kubenswrapper[4734]: E1205 23:21:26.614868 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:21:26 crc kubenswrapper[4734]: E1205 23:21:26.615316 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-8bfg7_openshift-ovn-kubernetes(2927a376-2f69-4820-a222-b86f08ece55a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" podUID="2927a376-2f69-4820-a222-b86f08ece55a" Dec 05 23:21:28 crc kubenswrapper[4734]: I1205 23:21:28.613928 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:21:28 crc kubenswrapper[4734]: I1205 23:21:28.613989 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:21:28 crc kubenswrapper[4734]: I1205 23:21:28.614004 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:21:28 crc kubenswrapper[4734]: I1205 23:21:28.614176 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:21:28 crc kubenswrapper[4734]: E1205 23:21:28.614211 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:21:28 crc kubenswrapper[4734]: E1205 23:21:28.614331 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:21:28 crc kubenswrapper[4734]: E1205 23:21:28.614484 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:21:28 crc kubenswrapper[4734]: E1205 23:21:28.614660 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:21:29 crc kubenswrapper[4734]: I1205 23:21:29.663045 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=10.66302033 podStartE2EDuration="10.66302033s" podCreationTimestamp="2025-12-05 23:21:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:21:29.661810011 +0000 UTC m=+110.345244127" watchObservedRunningTime="2025-12-05 23:21:29.66302033 +0000 UTC m=+110.346424606" Dec 05 23:21:30 crc kubenswrapper[4734]: I1205 23:21:30.613460 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:21:30 crc kubenswrapper[4734]: I1205 23:21:30.613494 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:21:30 crc kubenswrapper[4734]: I1205 23:21:30.613490 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:21:30 crc kubenswrapper[4734]: E1205 23:21:30.614549 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:21:30 crc kubenswrapper[4734]: E1205 23:21:30.614689 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:21:30 crc kubenswrapper[4734]: I1205 23:21:30.613549 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:21:30 crc kubenswrapper[4734]: E1205 23:21:30.614856 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:21:30 crc kubenswrapper[4734]: E1205 23:21:30.614397 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:21:32 crc kubenswrapper[4734]: I1205 23:21:32.613622 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:21:32 crc kubenswrapper[4734]: I1205 23:21:32.613681 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:21:32 crc kubenswrapper[4734]: E1205 23:21:32.613775 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:21:32 crc kubenswrapper[4734]: I1205 23:21:32.613888 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:21:32 crc kubenswrapper[4734]: I1205 23:21:32.614090 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:21:32 crc kubenswrapper[4734]: E1205 23:21:32.614080 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:21:32 crc kubenswrapper[4734]: E1205 23:21:32.614166 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:21:32 crc kubenswrapper[4734]: E1205 23:21:32.614413 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:21:34 crc kubenswrapper[4734]: I1205 23:21:34.303999 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d6kmh_1d76dc4e-40f3-4457-9a99-16f9a8ca8081/kube-multus/1.log" Dec 05 23:21:34 crc kubenswrapper[4734]: I1205 23:21:34.305429 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d6kmh_1d76dc4e-40f3-4457-9a99-16f9a8ca8081/kube-multus/0.log" Dec 05 23:21:34 crc kubenswrapper[4734]: I1205 23:21:34.305496 4734 generic.go:334] "Generic (PLEG): container finished" podID="1d76dc4e-40f3-4457-9a99-16f9a8ca8081" containerID="8453d43131f407bdf61410dd38713b44aea86c8647825551f40b2c41552206e8" exitCode=1 Dec 05 23:21:34 crc kubenswrapper[4734]: I1205 23:21:34.305600 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d6kmh" event={"ID":"1d76dc4e-40f3-4457-9a99-16f9a8ca8081","Type":"ContainerDied","Data":"8453d43131f407bdf61410dd38713b44aea86c8647825551f40b2c41552206e8"} Dec 05 23:21:34 crc kubenswrapper[4734]: I1205 23:21:34.305717 4734 scope.go:117] "RemoveContainer" containerID="ad9f093a04efdb5a9b3990df19604418f4d9213b08f680235a67891a0207c1a8" Dec 05 23:21:34 crc kubenswrapper[4734]: I1205 23:21:34.306380 4734 scope.go:117] "RemoveContainer" containerID="8453d43131f407bdf61410dd38713b44aea86c8647825551f40b2c41552206e8" Dec 05 23:21:34 crc 
kubenswrapper[4734]: E1205 23:21:34.306789 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-d6kmh_openshift-multus(1d76dc4e-40f3-4457-9a99-16f9a8ca8081)\"" pod="openshift-multus/multus-d6kmh" podUID="1d76dc4e-40f3-4457-9a99-16f9a8ca8081" Dec 05 23:21:34 crc kubenswrapper[4734]: I1205 23:21:34.613883 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:21:34 crc kubenswrapper[4734]: I1205 23:21:34.613883 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:21:34 crc kubenswrapper[4734]: I1205 23:21:34.614023 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:21:34 crc kubenswrapper[4734]: I1205 23:21:34.614038 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:21:34 crc kubenswrapper[4734]: E1205 23:21:34.614477 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:21:34 crc kubenswrapper[4734]: E1205 23:21:34.614640 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:21:34 crc kubenswrapper[4734]: E1205 23:21:34.614842 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:21:34 crc kubenswrapper[4734]: E1205 23:21:34.614494 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:21:35 crc kubenswrapper[4734]: I1205 23:21:35.311621 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d6kmh_1d76dc4e-40f3-4457-9a99-16f9a8ca8081/kube-multus/1.log" Dec 05 23:21:36 crc kubenswrapper[4734]: I1205 23:21:36.613334 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:21:36 crc kubenswrapper[4734]: I1205 23:21:36.613334 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:21:36 crc kubenswrapper[4734]: I1205 23:21:36.613348 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:21:36 crc kubenswrapper[4734]: I1205 23:21:36.613354 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:21:36 crc kubenswrapper[4734]: E1205 23:21:36.614808 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:21:36 crc kubenswrapper[4734]: E1205 23:21:36.614635 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:21:36 crc kubenswrapper[4734]: E1205 23:21:36.614875 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:21:36 crc kubenswrapper[4734]: E1205 23:21:36.614432 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:21:38 crc kubenswrapper[4734]: I1205 23:21:38.613147 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:21:38 crc kubenswrapper[4734]: I1205 23:21:38.613367 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:21:38 crc kubenswrapper[4734]: I1205 23:21:38.613578 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:21:38 crc kubenswrapper[4734]: E1205 23:21:38.613793 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:21:38 crc kubenswrapper[4734]: I1205 23:21:38.613875 4734 scope.go:117] "RemoveContainer" containerID="9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549" Dec 05 23:21:38 crc kubenswrapper[4734]: I1205 23:21:38.614060 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:21:38 crc kubenswrapper[4734]: E1205 23:21:38.614087 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-8bfg7_openshift-ovn-kubernetes(2927a376-2f69-4820-a222-b86f08ece55a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" podUID="2927a376-2f69-4820-a222-b86f08ece55a" Dec 05 23:21:38 crc kubenswrapper[4734]: E1205 23:21:38.614239 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:21:38 crc kubenswrapper[4734]: E1205 23:21:38.614375 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:21:38 crc kubenswrapper[4734]: E1205 23:21:38.614505 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:21:39 crc kubenswrapper[4734]: E1205 23:21:39.619436 4734 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 05 23:21:39 crc kubenswrapper[4734]: E1205 23:21:39.704832 4734 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 23:21:40 crc kubenswrapper[4734]: I1205 23:21:40.613366 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:21:40 crc kubenswrapper[4734]: I1205 23:21:40.613455 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:21:40 crc kubenswrapper[4734]: E1205 23:21:40.613606 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:21:40 crc kubenswrapper[4734]: I1205 23:21:40.613629 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:21:40 crc kubenswrapper[4734]: I1205 23:21:40.613704 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:21:40 crc kubenswrapper[4734]: E1205 23:21:40.613896 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:21:40 crc kubenswrapper[4734]: E1205 23:21:40.614053 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:21:40 crc kubenswrapper[4734]: E1205 23:21:40.614334 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:21:42 crc kubenswrapper[4734]: I1205 23:21:42.613716 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:21:42 crc kubenswrapper[4734]: I1205 23:21:42.613748 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:21:42 crc kubenswrapper[4734]: I1205 23:21:42.613826 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:21:42 crc kubenswrapper[4734]: E1205 23:21:42.613890 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:21:42 crc kubenswrapper[4734]: I1205 23:21:42.613926 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:21:42 crc kubenswrapper[4734]: E1205 23:21:42.614115 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:21:42 crc kubenswrapper[4734]: E1205 23:21:42.614589 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:21:42 crc kubenswrapper[4734]: E1205 23:21:42.614323 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:21:44 crc kubenswrapper[4734]: I1205 23:21:44.613338 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:21:44 crc kubenswrapper[4734]: I1205 23:21:44.613584 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:21:44 crc kubenswrapper[4734]: E1205 23:21:44.613737 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:21:44 crc kubenswrapper[4734]: I1205 23:21:44.613798 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:21:44 crc kubenswrapper[4734]: I1205 23:21:44.613847 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:21:44 crc kubenswrapper[4734]: E1205 23:21:44.613981 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:21:44 crc kubenswrapper[4734]: E1205 23:21:44.614194 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:21:44 crc kubenswrapper[4734]: E1205 23:21:44.614338 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:21:44 crc kubenswrapper[4734]: E1205 23:21:44.706903 4734 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 23:21:46 crc kubenswrapper[4734]: I1205 23:21:46.613846 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:21:46 crc kubenswrapper[4734]: I1205 23:21:46.613948 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:21:46 crc kubenswrapper[4734]: I1205 23:21:46.614033 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:21:46 crc kubenswrapper[4734]: E1205 23:21:46.614193 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:21:46 crc kubenswrapper[4734]: I1205 23:21:46.614318 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:21:46 crc kubenswrapper[4734]: E1205 23:21:46.614428 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:21:46 crc kubenswrapper[4734]: E1205 23:21:46.614596 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:21:46 crc kubenswrapper[4734]: E1205 23:21:46.614730 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:21:46 crc kubenswrapper[4734]: I1205 23:21:46.615302 4734 scope.go:117] "RemoveContainer" containerID="8453d43131f407bdf61410dd38713b44aea86c8647825551f40b2c41552206e8" Dec 05 23:21:47 crc kubenswrapper[4734]: I1205 23:21:47.366660 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d6kmh_1d76dc4e-40f3-4457-9a99-16f9a8ca8081/kube-multus/1.log" Dec 05 23:21:47 crc kubenswrapper[4734]: I1205 23:21:47.366743 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d6kmh" event={"ID":"1d76dc4e-40f3-4457-9a99-16f9a8ca8081","Type":"ContainerStarted","Data":"a8a1ca8b179a33db1ca18703b7ff293739d406b155da94b438e9d16f215c6bb4"} Dec 05 23:21:48 crc kubenswrapper[4734]: I1205 23:21:48.613908 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:21:48 crc kubenswrapper[4734]: I1205 23:21:48.614089 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:21:48 crc kubenswrapper[4734]: E1205 23:21:48.614130 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:21:48 crc kubenswrapper[4734]: I1205 23:21:48.614157 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:21:48 crc kubenswrapper[4734]: E1205 23:21:48.614376 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:21:48 crc kubenswrapper[4734]: E1205 23:21:48.614690 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:21:48 crc kubenswrapper[4734]: I1205 23:21:48.615032 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:21:48 crc kubenswrapper[4734]: E1205 23:21:48.615218 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:21:49 crc kubenswrapper[4734]: I1205 23:21:49.616119 4734 scope.go:117] "RemoveContainer" containerID="9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549" Dec 05 23:21:49 crc kubenswrapper[4734]: E1205 23:21:49.707488 4734 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 05 23:21:50 crc kubenswrapper[4734]: I1205 23:21:50.381700 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8bfg7_2927a376-2f69-4820-a222-b86f08ece55a/ovnkube-controller/3.log" Dec 05 23:21:50 crc kubenswrapper[4734]: I1205 23:21:50.385389 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" event={"ID":"2927a376-2f69-4820-a222-b86f08ece55a","Type":"ContainerStarted","Data":"3ce5cf13b483ebc55854fae2f3982a0784529abdb45de37fc8a07db02fa1fb80"} Dec 05 23:21:50 crc kubenswrapper[4734]: I1205 23:21:50.386030 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:21:50 crc kubenswrapper[4734]: I1205 23:21:50.423171 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" podStartSLOduration=110.423137869 podStartE2EDuration="1m50.423137869s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:21:50.420328122 +0000 UTC m=+131.103732408" watchObservedRunningTime="2025-12-05 23:21:50.423137869 +0000 UTC m=+131.106542185" Dec 05 23:21:50 crc kubenswrapper[4734]: I1205 23:21:50.540024 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-l6r6g"] Dec 05 23:21:50 crc kubenswrapper[4734]: I1205 23:21:50.540251 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:21:50 crc kubenswrapper[4734]: E1205 23:21:50.540492 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:21:50 crc kubenswrapper[4734]: I1205 23:21:50.613903 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:21:50 crc kubenswrapper[4734]: I1205 23:21:50.613965 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:21:50 crc kubenswrapper[4734]: E1205 23:21:50.614180 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:21:50 crc kubenswrapper[4734]: E1205 23:21:50.614327 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:21:50 crc kubenswrapper[4734]: I1205 23:21:50.614541 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:21:50 crc kubenswrapper[4734]: E1205 23:21:50.614695 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:21:52 crc kubenswrapper[4734]: I1205 23:21:52.613816 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:21:52 crc kubenswrapper[4734]: E1205 23:21:52.614189 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:21:52 crc kubenswrapper[4734]: I1205 23:21:52.613905 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:21:52 crc kubenswrapper[4734]: E1205 23:21:52.614285 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:21:52 crc kubenswrapper[4734]: I1205 23:21:52.613817 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:21:52 crc kubenswrapper[4734]: I1205 23:21:52.613901 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:21:52 crc kubenswrapper[4734]: E1205 23:21:52.614352 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:21:52 crc kubenswrapper[4734]: E1205 23:21:52.614613 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:21:54 crc kubenswrapper[4734]: I1205 23:21:54.613223 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:21:54 crc kubenswrapper[4734]: I1205 23:21:54.613284 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:21:54 crc kubenswrapper[4734]: I1205 23:21:54.613312 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:21:54 crc kubenswrapper[4734]: I1205 23:21:54.613251 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:21:54 crc kubenswrapper[4734]: E1205 23:21:54.613500 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 23:21:54 crc kubenswrapper[4734]: E1205 23:21:54.613636 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 23:21:54 crc kubenswrapper[4734]: E1205 23:21:54.613729 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l6r6g" podUID="641af4fe-dd54-4118-8985-d37a03d64f79" Dec 05 23:21:54 crc kubenswrapper[4734]: E1205 23:21:54.613858 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.124449 4734 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.173128 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-j6jsf"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.174402 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-j6jsf" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.175637 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-x5rwz"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.176166 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.176721 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.176734 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fsrfk"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.176830 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x5rwz" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.177755 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fsrfk" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.179172 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-776fp"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.179533 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.180088 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-776fp" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.180853 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gxdpj"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.181207 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-xsgsc"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.181641 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsgsc" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.182007 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.187106 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.189439 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gkww2"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.190369 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-gkww2" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.192190 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.192385 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.192402 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.192531 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.192583 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.192666 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.192770 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 05 23:21:56 crc 
kubenswrapper[4734]: I1205 23:21:56.192812 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.192907 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.192950 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.192941 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.193050 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.193225 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.193289 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.193456 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.193521 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.193469 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.193725 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" 
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.194722 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.194903 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.195065 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.196620 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-htwjw"]
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.196678 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.197530 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.202035 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/74a8397f-0607-4761-9fc5-77e9a6d197c8-images\") pod \"machine-api-operator-5694c8668f-j6jsf\" (UID: \"74a8397f-0607-4761-9fc5-77e9a6d197c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j6jsf"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.202101 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/376d1200-a143-4f81-9399-41fdafd1f0b1-config\") pod \"controller-manager-879f6c89f-fsrfk\" (UID: \"376d1200-a143-4f81-9399-41fdafd1f0b1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fsrfk"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.202134 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/07e87699-6af3-4f68-a3c8-85780433774b-etcd-client\") pod \"apiserver-7bbb656c7d-54qxz\" (UID: \"07e87699-6af3-4f68-a3c8-85780433774b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.202195 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/07e87699-6af3-4f68-a3c8-85780433774b-audit-dir\") pod \"apiserver-7bbb656c7d-54qxz\" (UID: \"07e87699-6af3-4f68-a3c8-85780433774b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.202219 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86bfea27-2a17-463f-9768-201c49599d74-config\") pod \"authentication-operator-69f744f599-776fp\" (UID: \"86bfea27-2a17-463f-9768-201c49599d74\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-776fp"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.202281 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/07e87699-6af3-4f68-a3c8-85780433774b-encryption-config\") pod \"apiserver-7bbb656c7d-54qxz\" (UID: \"07e87699-6af3-4f68-a3c8-85780433774b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.202313 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86bfea27-2a17-463f-9768-201c49599d74-service-ca-bundle\") pod \"authentication-operator-69f744f599-776fp\" (UID: \"86bfea27-2a17-463f-9768-201c49599d74\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-776fp"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.202342 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8kkk\" (UniqueName: \"kubernetes.io/projected/376d1200-a143-4f81-9399-41fdafd1f0b1-kube-api-access-q8kkk\") pod \"controller-manager-879f6c89f-fsrfk\" (UID: \"376d1200-a143-4f81-9399-41fdafd1f0b1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fsrfk"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.202374 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7ff8265-c71c-4b81-a8db-3b68a2118fd6-serving-cert\") pod \"route-controller-manager-6576b87f9c-x5rwz\" (UID: \"b7ff8265-c71c-4b81-a8db-3b68a2118fd6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x5rwz"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.202399 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/376d1200-a143-4f81-9399-41fdafd1f0b1-serving-cert\") pod \"controller-manager-879f6c89f-fsrfk\" (UID: \"376d1200-a143-4f81-9399-41fdafd1f0b1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fsrfk"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.202429 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07e87699-6af3-4f68-a3c8-85780433774b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-54qxz\" (UID: \"07e87699-6af3-4f68-a3c8-85780433774b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.202476 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/74a8397f-0607-4761-9fc5-77e9a6d197c8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-j6jsf\" (UID: \"74a8397f-0607-4761-9fc5-77e9a6d197c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j6jsf"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.202514 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znrjn\" (UniqueName: \"kubernetes.io/projected/74a8397f-0607-4761-9fc5-77e9a6d197c8-kube-api-access-znrjn\") pod \"machine-api-operator-5694c8668f-j6jsf\" (UID: \"74a8397f-0607-4761-9fc5-77e9a6d197c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j6jsf"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.202578 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/376d1200-a143-4f81-9399-41fdafd1f0b1-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fsrfk\" (UID: \"376d1200-a143-4f81-9399-41fdafd1f0b1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fsrfk"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.202605 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/07e87699-6af3-4f68-a3c8-85780433774b-audit-policies\") pod \"apiserver-7bbb656c7d-54qxz\" (UID: \"07e87699-6af3-4f68-a3c8-85780433774b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.202680 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b7ff8265-c71c-4b81-a8db-3b68a2118fd6-client-ca\") pod \"route-controller-manager-6576b87f9c-x5rwz\" (UID: \"b7ff8265-c71c-4b81-a8db-3b68a2118fd6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x5rwz"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.202727 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/376d1200-a143-4f81-9399-41fdafd1f0b1-client-ca\") pod \"controller-manager-879f6c89f-fsrfk\" (UID: \"376d1200-a143-4f81-9399-41fdafd1f0b1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fsrfk"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.202757 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86bfea27-2a17-463f-9768-201c49599d74-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-776fp\" (UID: \"86bfea27-2a17-463f-9768-201c49599d74\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-776fp"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.202792 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07e87699-6af3-4f68-a3c8-85780433774b-serving-cert\") pod \"apiserver-7bbb656c7d-54qxz\" (UID: \"07e87699-6af3-4f68-a3c8-85780433774b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.202818 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl6ld\" (UniqueName: \"kubernetes.io/projected/b7ff8265-c71c-4b81-a8db-3b68a2118fd6-kube-api-access-tl6ld\") pod \"route-controller-manager-6576b87f9c-x5rwz\" (UID: \"b7ff8265-c71c-4b81-a8db-3b68a2118fd6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x5rwz"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.202846 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f9g7\" (UniqueName: \"kubernetes.io/projected/86bfea27-2a17-463f-9768-201c49599d74-kube-api-access-7f9g7\") pod \"authentication-operator-69f744f599-776fp\" (UID: \"86bfea27-2a17-463f-9768-201c49599d74\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-776fp"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.202923 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7ff8265-c71c-4b81-a8db-3b68a2118fd6-config\") pod \"route-controller-manager-6576b87f9c-x5rwz\" (UID: \"b7ff8265-c71c-4b81-a8db-3b68a2118fd6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x5rwz"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.202944 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/07e87699-6af3-4f68-a3c8-85780433774b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-54qxz\" (UID: \"07e87699-6af3-4f68-a3c8-85780433774b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.202963 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6jw7\" (UniqueName: \"kubernetes.io/projected/07e87699-6af3-4f68-a3c8-85780433774b-kube-api-access-r6jw7\") pod \"apiserver-7bbb656c7d-54qxz\" (UID: \"07e87699-6af3-4f68-a3c8-85780433774b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.203005 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86bfea27-2a17-463f-9768-201c49599d74-serving-cert\") pod \"authentication-operator-69f744f599-776fp\" (UID: \"86bfea27-2a17-463f-9768-201c49599d74\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-776fp"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.203109 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74a8397f-0607-4761-9fc5-77e9a6d197c8-config\") pod \"machine-api-operator-5694c8668f-j6jsf\" (UID: \"74a8397f-0607-4761-9fc5-77e9a6d197c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j6jsf"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.203123 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.204018 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ptxqw"]
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.204668 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.205457 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-htwjw"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.206093 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-txdvl"]
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.206658 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.206997 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-txdvl"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.207137 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.207385 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wl2fs"]
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.227946 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wl2fs"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.229882 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k9fv6"]
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.231487 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k9fv6"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.234640 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k9rd"]
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.236124 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k9rd"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.239380 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.240036 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.240455 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.240841 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.241171 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.241315 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.241633 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.241697 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.241937 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.242144 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.242298 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.242499 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.242524 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.242956 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.243256 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.243358 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.243433 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.243490 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.243702 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.243819 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.243349 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.244001 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.244093 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.244416 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.244644 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.244757 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.244850 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.243534 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.245486 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.247123 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.252108 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.265657 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.266757 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-5h9wr"]
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.267254 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-5h9wr"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.267637 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.267975 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.268087 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.268121 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.271416 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44nn9"]
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.272106 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2dx28"]
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.272489 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2dx28"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.272510 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44nn9"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.273647 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-x67qn"]
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.274563 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-x67qn"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.281584 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.281854 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.282661 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.285808 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.285986 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.286111 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.286362 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.287217 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.287401 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.290755 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.290918 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.291125 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.292037 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.292143 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.292630 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zwcnb"]
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.293378 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zwcnb"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.293452 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-svcq4"]
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.294302 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-svcq4"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.297184 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-x5rwz"]
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.297418 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-m6c2b"]
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.298270 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-m6c2b"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.303509 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.303891 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.304183 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.317222 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qqvvd"]
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.320522 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86bfea27-2a17-463f-9768-201c49599d74-serving-cert\") pod \"authentication-operator-69f744f599-776fp\" (UID: \"86bfea27-2a17-463f-9768-201c49599d74\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-776fp"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.320606 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-console-config\") pod \"console-f9d7485db-5h9wr\" (UID: \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\") " pod="openshift-console/console-f9d7485db-5h9wr"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.320673 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn9xj\" (UniqueName: \"kubernetes.io/projected/99827392-eef9-4b43-ab05-d57f8bc8d3ef-kube-api-access-xn9xj\") pod \"cluster-samples-operator-665b6dd947-wl2fs\" (UID: \"99827392-eef9-4b43-ab05-d57f8bc8d3ef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wl2fs"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.320712 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74a8397f-0607-4761-9fc5-77e9a6d197c8-config\") pod \"machine-api-operator-5694c8668f-j6jsf\" (UID: \"74a8397f-0607-4761-9fc5-77e9a6d197c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j6jsf"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.320747 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf8j9\" (UniqueName: \"kubernetes.io/projected/30f5e37c-8b52-4347-bc5e-a973ca06a7bf-kube-api-access-bf8j9\") pod \"console-operator-58897d9998-gkww2\" (UID: \"30f5e37c-8b52-4347-bc5e-a973ca06a7bf\") " pod="openshift-console-operator/console-operator-58897d9998-gkww2"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.320834 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvcxp\" (UniqueName: \"kubernetes.io/projected/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-kube-api-access-qvcxp\") pod \"console-f9d7485db-5h9wr\" (UID: \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\") " pod="openshift-console/console-f9d7485db-5h9wr"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.320892 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/99827392-eef9-4b43-ab05-d57f8bc8d3ef-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wl2fs\" (UID: \"99827392-eef9-4b43-ab05-d57f8bc8d3ef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wl2fs"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.321169 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0152808-f0b7-4ce4-9bc1-6bc11e69bd7f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2dx28\" (UID: \"a0152808-f0b7-4ce4-9bc1-6bc11e69bd7f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2dx28"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.321200 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7adf273b-63fb-40fe-9d0d-fe467260565b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.321230 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-console-oauth-config\") pod \"console-f9d7485db-5h9wr\" (UID: \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\") " pod="openshift-console/console-f9d7485db-5h9wr"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.321255 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.321300 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.321598 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4xnw\" (UniqueName: \"kubernetes.io/projected/448e552f-8a25-469f-b959-4fbe91ae9035-kube-api-access-m4xnw\") pod \"openshift-apiserver-operator-796bbdcf4f-44nn9\" (UID: \"448e552f-8a25-469f-b959-4fbe91ae9035\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44nn9"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.321630 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/74a8397f-0607-4761-9fc5-77e9a6d197c8-images\") pod \"machine-api-operator-5694c8668f-j6jsf\" (UID: \"74a8397f-0607-4761-9fc5-77e9a6d197c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j6jsf"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.321655 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/776e53fa-bf9e-44c4-8f89-2f78059733a7-serving-cert\") pod \"openshift-config-operator-7777fb866f-txdvl\" (UID: \"776e53fa-bf9e-44c4-8f89-2f78059733a7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-txdvl"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.321681 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/376d1200-a143-4f81-9399-41fdafd1f0b1-config\") pod \"controller-manager-879f6c89f-fsrfk\" (UID: \"376d1200-a143-4f81-9399-41fdafd1f0b1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fsrfk"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.321706 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/07e87699-6af3-4f68-a3c8-85780433774b-etcd-client\") pod \"apiserver-7bbb656c7d-54qxz\" (UID: \"07e87699-6af3-4f68-a3c8-85780433774b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.321885 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/07e87699-6af3-4f68-a3c8-85780433774b-audit-dir\") pod \"apiserver-7bbb656c7d-54qxz\" (UID: \"07e87699-6af3-4f68-a3c8-85780433774b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.321914 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86bfea27-2a17-463f-9768-201c49599d74-config\") pod \"authentication-operator-69f744f599-776fp\" (UID: \"86bfea27-2a17-463f-9768-201c49599d74\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-776fp"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.321943 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mhsf\" (UniqueName: \"kubernetes.io/projected/4520b844-1a95-4300-8a10-5ef68e2067cb-kube-api-access-2mhsf\") pod \"machine-approver-56656f9798-xsgsc\" (UID: \"4520b844-1a95-4300-8a10-5ef68e2067cb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsgsc"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.321975 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-service-ca\") pod \"console-f9d7485db-5h9wr\" (UID: \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\") " pod="openshift-console/console-f9d7485db-5h9wr"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.322022 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.322208 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/776e53fa-bf9e-44c4-8f89-2f78059733a7-available-featuregates\") pod \"openshift-config-operator-7777fb866f-txdvl\" (UID: \"776e53fa-bf9e-44c4-8f89-2f78059733a7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-txdvl"
Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.322228 4734 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0152808-f0b7-4ce4-9bc1-6bc11e69bd7f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2dx28\" (UID: \"a0152808-f0b7-4ce4-9bc1-6bc11e69bd7f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2dx28" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.322284 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7adf273b-63fb-40fe-9d0d-fe467260565b-etcd-serving-ca\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.322311 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcclz\" (UniqueName: \"kubernetes.io/projected/547872c0-29db-43b6-a531-14610127080d-kube-api-access-rcclz\") pod \"cluster-image-registry-operator-dc59b4c8b-4k9rd\" (UID: \"547872c0-29db-43b6-a531-14610127080d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k9rd" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.322579 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/07e87699-6af3-4f68-a3c8-85780433774b-encryption-config\") pod \"apiserver-7bbb656c7d-54qxz\" (UID: \"07e87699-6af3-4f68-a3c8-85780433774b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.322618 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86bfea27-2a17-463f-9768-201c49599d74-service-ca-bundle\") pod 
\"authentication-operator-69f744f599-776fp\" (UID: \"86bfea27-2a17-463f-9768-201c49599d74\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-776fp" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.322641 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30f5e37c-8b52-4347-bc5e-a973ca06a7bf-config\") pod \"console-operator-58897d9998-gkww2\" (UID: \"30f5e37c-8b52-4347-bc5e-a973ca06a7bf\") " pod="openshift-console-operator/console-operator-58897d9998-gkww2" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.322665 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.322688 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4520b844-1a95-4300-8a10-5ef68e2067cb-machine-approver-tls\") pod \"machine-approver-56656f9798-xsgsc\" (UID: \"4520b844-1a95-4300-8a10-5ef68e2067cb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsgsc" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.322952 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8kkk\" (UniqueName: \"kubernetes.io/projected/376d1200-a143-4f81-9399-41fdafd1f0b1-kube-api-access-q8kkk\") pod \"controller-manager-879f6c89f-fsrfk\" (UID: \"376d1200-a143-4f81-9399-41fdafd1f0b1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fsrfk" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 
23:21:56.322985 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30f5e37c-8b52-4347-bc5e-a973ca06a7bf-serving-cert\") pod \"console-operator-58897d9998-gkww2\" (UID: \"30f5e37c-8b52-4347-bc5e-a973ca06a7bf\") " pod="openshift-console-operator/console-operator-58897d9998-gkww2" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.323007 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7adf273b-63fb-40fe-9d0d-fe467260565b-serving-cert\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.323036 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-audit-policies\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.323062 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4520b844-1a95-4300-8a10-5ef68e2067cb-config\") pod \"machine-approver-56656f9798-xsgsc\" (UID: \"4520b844-1a95-4300-8a10-5ef68e2067cb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsgsc" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.323087 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7ff8265-c71c-4b81-a8db-3b68a2118fd6-serving-cert\") pod \"route-controller-manager-6576b87f9c-x5rwz\" (UID: \"b7ff8265-c71c-4b81-a8db-3b68a2118fd6\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x5rwz" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.323329 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.323361 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07e87699-6af3-4f68-a3c8-85780433774b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-54qxz\" (UID: \"07e87699-6af3-4f68-a3c8-85780433774b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.323384 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7adf273b-63fb-40fe-9d0d-fe467260565b-image-import-ca\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.323408 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7adf273b-63fb-40fe-9d0d-fe467260565b-encryption-config\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.323436 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.324095 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.327545 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/376d1200-a143-4f81-9399-41fdafd1f0b1-serving-cert\") pod \"controller-manager-879f6c89f-fsrfk\" (UID: \"376d1200-a143-4f81-9399-41fdafd1f0b1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fsrfk" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.327597 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7adf273b-63fb-40fe-9d0d-fe467260565b-config\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.327629 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/74a8397f-0607-4761-9fc5-77e9a6d197c8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-j6jsf\" (UID: \"74a8397f-0607-4761-9fc5-77e9a6d197c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j6jsf" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.327673 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-trusted-ca-bundle\") pod \"console-f9d7485db-5h9wr\" (UID: \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\") " pod="openshift-console/console-f9d7485db-5h9wr" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.327699 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-oauth-serving-cert\") pod \"console-f9d7485db-5h9wr\" (UID: \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\") " pod="openshift-console/console-f9d7485db-5h9wr" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.327863 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.328717 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.329642 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6zcx\" (UniqueName: \"kubernetes.io/projected/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-kube-api-access-t6zcx\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.329688 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbtkr\" (UniqueName: 
\"kubernetes.io/projected/7adf273b-63fb-40fe-9d0d-fe467260565b-kube-api-access-vbtkr\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.329711 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.329743 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/547872c0-29db-43b6-a531-14610127080d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4k9rd\" (UID: \"547872c0-29db-43b6-a531-14610127080d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k9rd" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.329805 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znrjn\" (UniqueName: \"kubernetes.io/projected/74a8397f-0607-4761-9fc5-77e9a6d197c8-kube-api-access-znrjn\") pod \"machine-api-operator-5694c8668f-j6jsf\" (UID: \"74a8397f-0607-4761-9fc5-77e9a6d197c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j6jsf" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.329885 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/376d1200-a143-4f81-9399-41fdafd1f0b1-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fsrfk\" (UID: \"376d1200-a143-4f81-9399-41fdafd1f0b1\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-fsrfk" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.329929 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/07e87699-6af3-4f68-a3c8-85780433774b-audit-policies\") pod \"apiserver-7bbb656c7d-54qxz\" (UID: \"07e87699-6af3-4f68-a3c8-85780433774b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.329960 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/547872c0-29db-43b6-a531-14610127080d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4k9rd\" (UID: \"547872c0-29db-43b6-a531-14610127080d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k9rd" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.330019 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.330046 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0152808-f0b7-4ce4-9bc1-6bc11e69bd7f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2dx28\" (UID: \"a0152808-f0b7-4ce4-9bc1-6bc11e69bd7f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2dx28" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.330081 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/376d1200-a143-4f81-9399-41fdafd1f0b1-client-ca\") pod \"controller-manager-879f6c89f-fsrfk\" (UID: \"376d1200-a143-4f81-9399-41fdafd1f0b1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fsrfk" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.330108 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86bfea27-2a17-463f-9768-201c49599d74-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-776fp\" (UID: \"86bfea27-2a17-463f-9768-201c49599d74\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-776fp" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.330133 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.330142 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kkt7h"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.330274 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b7ff8265-c71c-4b81-a8db-3b68a2118fd6-client-ca\") pod \"route-controller-manager-6576b87f9c-x5rwz\" (UID: \"b7ff8265-c71c-4b81-a8db-3b68a2118fd6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x5rwz" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.330305 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/07e87699-6af3-4f68-a3c8-85780433774b-serving-cert\") pod \"apiserver-7bbb656c7d-54qxz\" (UID: \"07e87699-6af3-4f68-a3c8-85780433774b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.330333 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/448e552f-8a25-469f-b959-4fbe91ae9035-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-44nn9\" (UID: \"448e552f-8a25-469f-b959-4fbe91ae9035\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44nn9" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.330362 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/448e552f-8a25-469f-b959-4fbe91ae9035-config\") pod \"openshift-apiserver-operator-796bbdcf4f-44nn9\" (UID: \"448e552f-8a25-469f-b959-4fbe91ae9035\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44nn9" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.331451 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/74a8397f-0607-4761-9fc5-77e9a6d197c8-images\") pod \"machine-api-operator-5694c8668f-j6jsf\" (UID: \"74a8397f-0607-4761-9fc5-77e9a6d197c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j6jsf" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.332010 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74a8397f-0607-4761-9fc5-77e9a6d197c8-config\") pod \"machine-api-operator-5694c8668f-j6jsf\" (UID: \"74a8397f-0607-4761-9fc5-77e9a6d197c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j6jsf" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.334253 4734 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qmqbr"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.336479 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07e87699-6af3-4f68-a3c8-85780433774b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-54qxz\" (UID: \"07e87699-6af3-4f68-a3c8-85780433774b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.337744 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl6ld\" (UniqueName: \"kubernetes.io/projected/b7ff8265-c71c-4b81-a8db-3b68a2118fd6-kube-api-access-tl6ld\") pod \"route-controller-manager-6576b87f9c-x5rwz\" (UID: \"b7ff8265-c71c-4b81-a8db-3b68a2118fd6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x5rwz" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.337794 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7adf273b-63fb-40fe-9d0d-fe467260565b-node-pullsecrets\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.337822 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7adf273b-63fb-40fe-9d0d-fe467260565b-etcd-client\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.337853 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/4520b844-1a95-4300-8a10-5ef68e2067cb-auth-proxy-config\") pod \"machine-approver-56656f9798-xsgsc\" (UID: \"4520b844-1a95-4300-8a10-5ef68e2067cb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsgsc" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.338197 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.338372 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.338591 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.339215 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-qmqbr" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.339430 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-2djpb"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.339552 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kkt7h" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.339646 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/07e87699-6af3-4f68-a3c8-85780433774b-audit-dir\") pod \"apiserver-7bbb656c7d-54qxz\" (UID: \"07e87699-6af3-4f68-a3c8-85780433774b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.340874 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b7ff8265-c71c-4b81-a8db-3b68a2118fd6-client-ca\") pod \"route-controller-manager-6576b87f9c-x5rwz\" (UID: \"b7ff8265-c71c-4b81-a8db-3b68a2118fd6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x5rwz" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.338052 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjbll\" (UniqueName: \"kubernetes.io/projected/776e53fa-bf9e-44c4-8f89-2f78059733a7-kube-api-access-vjbll\") pod \"openshift-config-operator-7777fb866f-txdvl\" (UID: \"776e53fa-bf9e-44c4-8f89-2f78059733a7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-txdvl" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.341074 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f9g7\" (UniqueName: \"kubernetes.io/projected/86bfea27-2a17-463f-9768-201c49599d74-kube-api-access-7f9g7\") pod \"authentication-operator-69f744f599-776fp\" (UID: \"86bfea27-2a17-463f-9768-201c49599d74\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-776fp" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.341155 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qqvvd" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.345109 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86bfea27-2a17-463f-9768-201c49599d74-service-ca-bundle\") pod \"authentication-operator-69f744f599-776fp\" (UID: \"86bfea27-2a17-463f-9768-201c49599d74\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-776fp" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.341714 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ldsd\" (UniqueName: \"kubernetes.io/projected/d391f1fa-9bbe-478c-a1da-6ccb8f75f3c5-kube-api-access-6ldsd\") pod \"downloads-7954f5f757-htwjw\" (UID: \"d391f1fa-9bbe-478c-a1da-6ccb8f75f3c5\") " pod="openshift-console/downloads-7954f5f757-htwjw" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.346028 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-audit-dir\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.346490 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/07e87699-6af3-4f68-a3c8-85780433774b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-54qxz\" (UID: \"07e87699-6af3-4f68-a3c8-85780433774b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.346588 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6jw7\" 
(UniqueName: \"kubernetes.io/projected/07e87699-6af3-4f68-a3c8-85780433774b-kube-api-access-r6jw7\") pod \"apiserver-7bbb656c7d-54qxz\" (UID: \"07e87699-6af3-4f68-a3c8-85780433774b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.346624 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/30f5e37c-8b52-4347-bc5e-a973ca06a7bf-trusted-ca\") pod \"console-operator-58897d9998-gkww2\" (UID: \"30f5e37c-8b52-4347-bc5e-a973ca06a7bf\") " pod="openshift-console-operator/console-operator-58897d9998-gkww2" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.346711 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.346712 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.346865 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/376d1200-a143-4f81-9399-41fdafd1f0b1-config\") pod \"controller-manager-879f6c89f-fsrfk\" (UID: \"376d1200-a143-4f81-9399-41fdafd1f0b1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fsrfk" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.347475 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7ff8265-c71c-4b81-a8db-3b68a2118fd6-config\") pod \"route-controller-manager-6576b87f9c-x5rwz\" (UID: \"b7ff8265-c71c-4b81-a8db-3b68a2118fd6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x5rwz" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.347626 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/376d1200-a143-4f81-9399-41fdafd1f0b1-serving-cert\") pod \"controller-manager-879f6c89f-fsrfk\" (UID: \"376d1200-a143-4f81-9399-41fdafd1f0b1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fsrfk" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.353656 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-console-serving-cert\") pod \"console-f9d7485db-5h9wr\" (UID: \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\") " pod="openshift-console/console-f9d7485db-5h9wr" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.353939 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07e87699-6af3-4f68-a3c8-85780433774b-serving-cert\") pod \"apiserver-7bbb656c7d-54qxz\" (UID: \"07e87699-6af3-4f68-a3c8-85780433774b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.356585 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7ff8265-c71c-4b81-a8db-3b68a2118fd6-serving-cert\") pod \"route-controller-manager-6576b87f9c-x5rwz\" (UID: \"b7ff8265-c71c-4b81-a8db-3b68a2118fd6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x5rwz" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.357723 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.358324 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/07e87699-6af3-4f68-a3c8-85780433774b-encryption-config\") pod \"apiserver-7bbb656c7d-54qxz\" (UID: 
\"07e87699-6af3-4f68-a3c8-85780433774b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.351677 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/74a8397f-0607-4761-9fc5-77e9a6d197c8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-j6jsf\" (UID: \"74a8397f-0607-4761-9fc5-77e9a6d197c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j6jsf" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.359499 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/376d1200-a143-4f81-9399-41fdafd1f0b1-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fsrfk\" (UID: \"376d1200-a143-4f81-9399-41fdafd1f0b1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fsrfk" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.359580 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86bfea27-2a17-463f-9768-201c49599d74-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-776fp\" (UID: \"86bfea27-2a17-463f-9768-201c49599d74\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-776fp" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.359657 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.360020 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/07e87699-6af3-4f68-a3c8-85780433774b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-54qxz\" (UID: \"07e87699-6af3-4f68-a3c8-85780433774b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz" Dec 05 23:21:56 crc 
kubenswrapper[4734]: I1205 23:21:56.360375 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lr2g"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.360658 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/376d1200-a143-4f81-9399-41fdafd1f0b1-client-ca\") pod \"controller-manager-879f6c89f-fsrfk\" (UID: \"376d1200-a143-4f81-9399-41fdafd1f0b1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fsrfk" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.361034 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86bfea27-2a17-463f-9768-201c49599d74-serving-cert\") pod \"authentication-operator-69f744f599-776fp\" (UID: \"86bfea27-2a17-463f-9768-201c49599d74\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-776fp" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.361043 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7adf273b-63fb-40fe-9d0d-fe467260565b-audit\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.361126 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/547872c0-29db-43b6-a531-14610127080d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4k9rd\" (UID: \"547872c0-29db-43b6-a531-14610127080d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k9rd" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.361165 4734 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.361190 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7adf273b-63fb-40fe-9d0d-fe467260565b-audit-dir\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.361387 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-2djpb" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.361418 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lr2g" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.361477 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86bfea27-2a17-463f-9768-201c49599d74-config\") pod \"authentication-operator-69f744f599-776fp\" (UID: \"86bfea27-2a17-463f-9768-201c49599d74\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-776fp" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.362502 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/07e87699-6af3-4f68-a3c8-85780433774b-audit-policies\") pod \"apiserver-7bbb656c7d-54qxz\" (UID: \"07e87699-6af3-4f68-a3c8-85780433774b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.362605 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7ff8265-c71c-4b81-a8db-3b68a2118fd6-config\") pod \"route-controller-manager-6576b87f9c-x5rwz\" (UID: \"b7ff8265-c71c-4b81-a8db-3b68a2118fd6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x5rwz" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.362715 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.362806 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.362848 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.362949 4734 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.363061 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.363160 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.363242 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.363262 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.363371 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.363500 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.363954 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.364497 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wmxp"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.365211 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wmxp" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.365477 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-d9vxl"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.365492 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/07e87699-6af3-4f68-a3c8-85780433774b-etcd-client\") pod \"apiserver-7bbb656c7d-54qxz\" (UID: \"07e87699-6af3-4f68-a3c8-85780433774b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.366578 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9vxl" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.369802 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.370058 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.370147 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-n9fhz"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.370295 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.370877 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jlxs"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.371563 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jlxs" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.371583 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-n9fhz" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.372422 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.373040 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.373749 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ws6qt"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.374429 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ws6qt" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.375411 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.377360 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.377641 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dqz9r"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.378515 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dqz9r" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.378813 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrqvs"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.379176 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrqvs" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.379563 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qvhvd"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.380492 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qvhvd" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.380797 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ckxzz"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.381452 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ckxzz" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.381791 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ds2cw"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.382348 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-ds2cw" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.382845 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416275-mfkvp"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.383317 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416275-mfkvp" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.384063 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tsb97"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.384803 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tsb97" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.385245 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-w5bmh"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.386022 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-w5bmh" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.386617 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-j6jsf"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.389847 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-htwjw"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.391281 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-776fp"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.396527 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gxdpj"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.396883 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.399759 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qqvvd"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.402403 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-n9fhz"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.403513 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-txdvl"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.405660 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fsrfk"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.419165 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.418577 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wl2fs"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.422824 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44nn9"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.423290 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-m6c2b"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.425131 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ptxqw"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.426393 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k9rd"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.427834 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.429082 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-svcq4"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.430457 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k9fv6"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.432848 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ws6qt"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.434475 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zwcnb"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.436777 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jlxs"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.437758 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.438963 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qmqbr"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.440757 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2dx28"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.442055 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wmxp"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.443150 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ckxzz"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.444266 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-5h9wr"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.445342 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4htrx"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.446481 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-mghs5"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.446773 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4htrx" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.447083 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-mghs5" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.448213 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kkt7h"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.448569 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gkww2"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.450102 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-d9vxl"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.451498 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lr2g"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.453368 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-x67qn"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.453481 4734 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tsb97"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.456107 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qvhvd"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.457704 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4htrx"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.459132 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dqz9r"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.460619 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrqvs"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.462398 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.462478 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4xnw\" (UniqueName: \"kubernetes.io/projected/448e552f-8a25-469f-b959-4fbe91ae9035-kube-api-access-m4xnw\") pod \"openshift-apiserver-operator-796bbdcf4f-44nn9\" (UID: \"448e552f-8a25-469f-b959-4fbe91ae9035\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44nn9" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.462513 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/776e53fa-bf9e-44c4-8f89-2f78059733a7-serving-cert\") pod \"openshift-config-operator-7777fb866f-txdvl\" (UID: \"776e53fa-bf9e-44c4-8f89-2f78059733a7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-txdvl" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.462564 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mhsf\" (UniqueName: \"kubernetes.io/projected/4520b844-1a95-4300-8a10-5ef68e2067cb-kube-api-access-2mhsf\") pod \"machine-approver-56656f9798-xsgsc\" (UID: \"4520b844-1a95-4300-8a10-5ef68e2067cb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsgsc" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.462593 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/776e53fa-bf9e-44c4-8f89-2f78059733a7-available-featuregates\") pod \"openshift-config-operator-7777fb866f-txdvl\" (UID: \"776e53fa-bf9e-44c4-8f89-2f78059733a7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-txdvl" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.462632 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0152808-f0b7-4ce4-9bc1-6bc11e69bd7f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2dx28\" (UID: \"a0152808-f0b7-4ce4-9bc1-6bc11e69bd7f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2dx28" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.462731 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-service-ca\") pod \"console-f9d7485db-5h9wr\" (UID: \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\") " pod="openshift-console/console-f9d7485db-5h9wr" Dec 05 23:21:56 crc 
kubenswrapper[4734]: I1205 23:21:56.462756 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.462806 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcclz\" (UniqueName: \"kubernetes.io/projected/547872c0-29db-43b6-a531-14610127080d-kube-api-access-rcclz\") pod \"cluster-image-registry-operator-dc59b4c8b-4k9rd\" (UID: \"547872c0-29db-43b6-a531-14610127080d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k9rd" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.462875 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7adf273b-63fb-40fe-9d0d-fe467260565b-etcd-serving-ca\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.462911 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4520b844-1a95-4300-8a10-5ef68e2067cb-machine-approver-tls\") pod \"machine-approver-56656f9798-xsgsc\" (UID: \"4520b844-1a95-4300-8a10-5ef68e2067cb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsgsc" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.462966 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30f5e37c-8b52-4347-bc5e-a973ca06a7bf-config\") pod 
\"console-operator-58897d9998-gkww2\" (UID: \"30f5e37c-8b52-4347-bc5e-a973ca06a7bf\") " pod="openshift-console-operator/console-operator-58897d9998-gkww2" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.462988 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.463010 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30f5e37c-8b52-4347-bc5e-a973ca06a7bf-serving-cert\") pod \"console-operator-58897d9998-gkww2\" (UID: \"30f5e37c-8b52-4347-bc5e-a973ca06a7bf\") " pod="openshift-console-operator/console-operator-58897d9998-gkww2" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.463034 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7adf273b-63fb-40fe-9d0d-fe467260565b-serving-cert\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.463063 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-audit-policies\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.463074 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.463082 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4520b844-1a95-4300-8a10-5ef68e2067cb-config\") pod \"machine-approver-56656f9798-xsgsc\" (UID: \"4520b844-1a95-4300-8a10-5ef68e2067cb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsgsc" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.463110 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.463136 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7adf273b-63fb-40fe-9d0d-fe467260565b-image-import-ca\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.463158 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7adf273b-63fb-40fe-9d0d-fe467260565b-encryption-config\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.463180 4734 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.463203 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7adf273b-63fb-40fe-9d0d-fe467260565b-config\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.463224 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-oauth-serving-cert\") pod \"console-f9d7485db-5h9wr\" (UID: \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\") " pod="openshift-console/console-f9d7485db-5h9wr" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.463244 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.463275 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-trusted-ca-bundle\") pod \"console-f9d7485db-5h9wr\" (UID: \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\") " pod="openshift-console/console-f9d7485db-5h9wr" Dec 05 23:21:56 
crc kubenswrapper[4734]: I1205 23:21:56.463297 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6zcx\" (UniqueName: \"kubernetes.io/projected/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-kube-api-access-t6zcx\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.463324 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbtkr\" (UniqueName: \"kubernetes.io/projected/7adf273b-63fb-40fe-9d0d-fe467260565b-kube-api-access-vbtkr\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.463345 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.463365 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/547872c0-29db-43b6-a531-14610127080d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4k9rd\" (UID: \"547872c0-29db-43b6-a531-14610127080d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k9rd" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.463425 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/547872c0-29db-43b6-a531-14610127080d-trusted-ca\") pod 
\"cluster-image-registry-operator-dc59b4c8b-4k9rd\" (UID: \"547872c0-29db-43b6-a531-14610127080d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k9rd" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.463450 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.463489 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0152808-f0b7-4ce4-9bc1-6bc11e69bd7f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2dx28\" (UID: \"a0152808-f0b7-4ce4-9bc1-6bc11e69bd7f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2dx28" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.463514 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.463559 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/448e552f-8a25-469f-b959-4fbe91ae9035-config\") pod \"openshift-apiserver-operator-796bbdcf4f-44nn9\" (UID: \"448e552f-8a25-469f-b959-4fbe91ae9035\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44nn9" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.463578 4734 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/448e552f-8a25-469f-b959-4fbe91ae9035-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-44nn9\" (UID: \"448e552f-8a25-469f-b959-4fbe91ae9035\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44nn9" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.463604 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjbll\" (UniqueName: \"kubernetes.io/projected/776e53fa-bf9e-44c4-8f89-2f78059733a7-kube-api-access-vjbll\") pod \"openshift-config-operator-7777fb866f-txdvl\" (UID: \"776e53fa-bf9e-44c4-8f89-2f78059733a7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-txdvl" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.463740 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7adf273b-63fb-40fe-9d0d-fe467260565b-node-pullsecrets\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.463758 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7adf273b-63fb-40fe-9d0d-fe467260565b-etcd-client\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.463821 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4520b844-1a95-4300-8a10-5ef68e2067cb-auth-proxy-config\") pod \"machine-approver-56656f9798-xsgsc\" (UID: \"4520b844-1a95-4300-8a10-5ef68e2067cb\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsgsc" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.463878 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ldsd\" (UniqueName: \"kubernetes.io/projected/d391f1fa-9bbe-478c-a1da-6ccb8f75f3c5-kube-api-access-6ldsd\") pod \"downloads-7954f5f757-htwjw\" (UID: \"d391f1fa-9bbe-478c-a1da-6ccb8f75f3c5\") " pod="openshift-console/downloads-7954f5f757-htwjw" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.463898 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-audit-dir\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.463927 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/30f5e37c-8b52-4347-bc5e-a973ca06a7bf-trusted-ca\") pod \"console-operator-58897d9998-gkww2\" (UID: \"30f5e37c-8b52-4347-bc5e-a973ca06a7bf\") " pod="openshift-console-operator/console-operator-58897d9998-gkww2" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.463944 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7adf273b-63fb-40fe-9d0d-fe467260565b-audit\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.463967 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-console-serving-cert\") pod \"console-f9d7485db-5h9wr\" (UID: 
\"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\") " pod="openshift-console/console-f9d7485db-5h9wr" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.463991 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/547872c0-29db-43b6-a531-14610127080d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4k9rd\" (UID: \"547872c0-29db-43b6-a531-14610127080d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k9rd" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.464018 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.464044 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7adf273b-63fb-40fe-9d0d-fe467260565b-audit-dir\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.464065 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-console-config\") pod \"console-f9d7485db-5h9wr\" (UID: \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\") " pod="openshift-console/console-f9d7485db-5h9wr" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.464090 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn9xj\" (UniqueName: 
\"kubernetes.io/projected/99827392-eef9-4b43-ab05-d57f8bc8d3ef-kube-api-access-xn9xj\") pod \"cluster-samples-operator-665b6dd947-wl2fs\" (UID: \"99827392-eef9-4b43-ab05-d57f8bc8d3ef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wl2fs" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.464112 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvcxp\" (UniqueName: \"kubernetes.io/projected/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-kube-api-access-qvcxp\") pod \"console-f9d7485db-5h9wr\" (UID: \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\") " pod="openshift-console/console-f9d7485db-5h9wr" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.464134 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/99827392-eef9-4b43-ab05-d57f8bc8d3ef-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wl2fs\" (UID: \"99827392-eef9-4b43-ab05-d57f8bc8d3ef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wl2fs" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.464157 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf8j9\" (UniqueName: \"kubernetes.io/projected/30f5e37c-8b52-4347-bc5e-a973ca06a7bf-kube-api-access-bf8j9\") pod \"console-operator-58897d9998-gkww2\" (UID: \"30f5e37c-8b52-4347-bc5e-a973ca06a7bf\") " pod="openshift-console-operator/console-operator-58897d9998-gkww2" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.464176 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0152808-f0b7-4ce4-9bc1-6bc11e69bd7f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2dx28\" (UID: \"a0152808-f0b7-4ce4-9bc1-6bc11e69bd7f\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2dx28" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.464721 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7adf273b-63fb-40fe-9d0d-fe467260565b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.464741 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-console-oauth-config\") pod \"console-f9d7485db-5h9wr\" (UID: \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\") " pod="openshift-console/console-f9d7485db-5h9wr" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.464787 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.465143 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/776e53fa-bf9e-44c4-8f89-2f78059733a7-available-featuregates\") pod \"openshift-config-operator-7777fb866f-txdvl\" (UID: \"776e53fa-bf9e-44c4-8f89-2f78059733a7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-txdvl" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.465384 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ds2cw"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 
23:21:56.465437 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416275-mfkvp"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.465571 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7adf273b-63fb-40fe-9d0d-fe467260565b-audit-dir\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.465781 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-audit-dir\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.465883 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7adf273b-63fb-40fe-9d0d-fe467260565b-node-pullsecrets\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.465904 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7adf273b-63fb-40fe-9d0d-fe467260565b-etcd-serving-ca\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.466466 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4520b844-1a95-4300-8a10-5ef68e2067cb-auth-proxy-config\") pod \"machine-approver-56656f9798-xsgsc\" (UID: 
\"4520b844-1a95-4300-8a10-5ef68e2067cb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsgsc" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.467041 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4520b844-1a95-4300-8a10-5ef68e2067cb-config\") pod \"machine-approver-56656f9798-xsgsc\" (UID: \"4520b844-1a95-4300-8a10-5ef68e2067cb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsgsc" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.467292 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30f5e37c-8b52-4347-bc5e-a973ca06a7bf-config\") pod \"console-operator-58897d9998-gkww2\" (UID: \"30f5e37c-8b52-4347-bc5e-a973ca06a7bf\") " pod="openshift-console-operator/console-operator-58897d9998-gkww2" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.468983 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-service-ca\") pod \"console-f9d7485db-5h9wr\" (UID: \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\") " pod="openshift-console/console-f9d7485db-5h9wr" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.468985 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-console-config\") pod \"console-f9d7485db-5h9wr\" (UID: \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\") " pod="openshift-console/console-f9d7485db-5h9wr" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.469087 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/30f5e37c-8b52-4347-bc5e-a973ca06a7bf-trusted-ca\") pod \"console-operator-58897d9998-gkww2\" (UID: \"30f5e37c-8b52-4347-bc5e-a973ca06a7bf\") " 
pod="openshift-console-operator/console-operator-58897d9998-gkww2" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.469142 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-audit-policies\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.469185 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-w5bmh"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.469208 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6l9sd"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.469527 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7adf273b-63fb-40fe-9d0d-fe467260565b-serving-cert\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.469181 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-trusted-ca-bundle\") pod \"console-f9d7485db-5h9wr\" (UID: \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\") " pod="openshift-console/console-f9d7485db-5h9wr" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.469844 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6l9sd"] Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.469899 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/547872c0-29db-43b6-a531-14610127080d-trusted-ca\") pod 
\"cluster-image-registry-operator-dc59b4c8b-4k9rd\" (UID: \"547872c0-29db-43b6-a531-14610127080d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k9rd" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.469931 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6l9sd" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.470033 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7adf273b-63fb-40fe-9d0d-fe467260565b-audit\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.470939 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0152808-f0b7-4ce4-9bc1-6bc11e69bd7f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2dx28\" (UID: \"a0152808-f0b7-4ce4-9bc1-6bc11e69bd7f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2dx28" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.471444 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7adf273b-63fb-40fe-9d0d-fe467260565b-config\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.471483 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc 
kubenswrapper[4734]: I1205 23:21:56.471601 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7adf273b-63fb-40fe-9d0d-fe467260565b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.471605 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30f5e37c-8b52-4347-bc5e-a973ca06a7bf-serving-cert\") pod \"console-operator-58897d9998-gkww2\" (UID: \"30f5e37c-8b52-4347-bc5e-a973ca06a7bf\") " pod="openshift-console-operator/console-operator-58897d9998-gkww2" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.471827 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/448e552f-8a25-469f-b959-4fbe91ae9035-config\") pod \"openshift-apiserver-operator-796bbdcf4f-44nn9\" (UID: \"448e552f-8a25-469f-b959-4fbe91ae9035\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44nn9" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.471896 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-oauth-serving-cert\") pod \"console-f9d7485db-5h9wr\" (UID: \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\") " pod="openshift-console/console-f9d7485db-5h9wr" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.471942 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7adf273b-63fb-40fe-9d0d-fe467260565b-image-import-ca\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:21:56 crc 
kubenswrapper[4734]: I1205 23:21:56.472271 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.473871 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.474989 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0152808-f0b7-4ce4-9bc1-6bc11e69bd7f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2dx28\" (UID: \"a0152808-f0b7-4ce4-9bc1-6bc11e69bd7f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2dx28" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.475115 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7adf273b-63fb-40fe-9d0d-fe467260565b-etcd-client\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.475161 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: 
\"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.475353 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-console-serving-cert\") pod \"console-f9d7485db-5h9wr\" (UID: \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\") " pod="openshift-console/console-f9d7485db-5h9wr" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.475557 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.475626 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4520b844-1a95-4300-8a10-5ef68e2067cb-machine-approver-tls\") pod \"machine-approver-56656f9798-xsgsc\" (UID: \"4520b844-1a95-4300-8a10-5ef68e2067cb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsgsc" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.476036 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/547872c0-29db-43b6-a531-14610127080d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4k9rd\" (UID: \"547872c0-29db-43b6-a531-14610127080d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k9rd" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.476284 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/448e552f-8a25-469f-b959-4fbe91ae9035-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-44nn9\" (UID: \"448e552f-8a25-469f-b959-4fbe91ae9035\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44nn9" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.478213 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-console-oauth-config\") pod \"console-f9d7485db-5h9wr\" (UID: \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\") " pod="openshift-console/console-f9d7485db-5h9wr" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.480267 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/776e53fa-bf9e-44c4-8f89-2f78059733a7-serving-cert\") pod \"openshift-config-operator-7777fb866f-txdvl\" (UID: \"776e53fa-bf9e-44c4-8f89-2f78059733a7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-txdvl" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.480670 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.481719 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc 
kubenswrapper[4734]: I1205 23:21:56.483501 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.483773 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.484377 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7adf273b-63fb-40fe-9d0d-fe467260565b-encryption-config\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.486000 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.487432 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/99827392-eef9-4b43-ab05-d57f8bc8d3ef-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wl2fs\" (UID: 
\"99827392-eef9-4b43-ab05-d57f8bc8d3ef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wl2fs" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.497625 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.517137 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.538120 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.556904 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.576762 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.598778 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.613553 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.613554 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.613657 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.613650 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.617848 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.637432 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.656747 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.678040 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.717587 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8kkk\" (UniqueName: \"kubernetes.io/projected/376d1200-a143-4f81-9399-41fdafd1f0b1-kube-api-access-q8kkk\") pod \"controller-manager-879f6c89f-fsrfk\" (UID: \"376d1200-a143-4f81-9399-41fdafd1f0b1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fsrfk" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.730437 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl6ld\" (UniqueName: \"kubernetes.io/projected/b7ff8265-c71c-4b81-a8db-3b68a2118fd6-kube-api-access-tl6ld\") pod \"route-controller-manager-6576b87f9c-x5rwz\" (UID: \"b7ff8265-c71c-4b81-a8db-3b68a2118fd6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x5rwz" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.751786 4734 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-znrjn\" (UniqueName: \"kubernetes.io/projected/74a8397f-0607-4761-9fc5-77e9a6d197c8-kube-api-access-znrjn\") pod \"machine-api-operator-5694c8668f-j6jsf\" (UID: \"74a8397f-0607-4761-9fc5-77e9a6d197c8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-j6jsf" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.757237 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.778285 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.798185 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.819198 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.829398 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-j6jsf" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.838377 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.868851 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.877151 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.896873 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.905101 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x5rwz" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.918617 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.942229 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.958699 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.978387 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.981686 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fsrfk" Dec 05 23:21:56 crc kubenswrapper[4734]: I1205 23:21:56.999791 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.037767 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6jw7\" (UniqueName: \"kubernetes.io/projected/07e87699-6af3-4f68-a3c8-85780433774b-kube-api-access-r6jw7\") pod \"apiserver-7bbb656c7d-54qxz\" (UID: \"07e87699-6af3-4f68-a3c8-85780433774b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.037933 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.059674 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.077016 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.095188 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-x5rwz"] Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.104107 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.117436 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.137734 4734 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-metrics-certs-default" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.149522 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fsrfk"] Dec 05 23:21:57 crc kubenswrapper[4734]: W1205 23:21:57.171228 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod376d1200_a143_4f81_9399_41fdafd1f0b1.slice/crio-60d67dc6b02aee9af46d24e4520f72f53517f8d5a6954d4f99fb039861c974b0 WatchSource:0}: Error finding container 60d67dc6b02aee9af46d24e4520f72f53517f8d5a6954d4f99fb039861c974b0: Status 404 returned error can't find the container with id 60d67dc6b02aee9af46d24e4520f72f53517f8d5a6954d4f99fb039861c974b0 Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.175916 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f9g7\" (UniqueName: \"kubernetes.io/projected/86bfea27-2a17-463f-9768-201c49599d74-kube-api-access-7f9g7\") pod \"authentication-operator-69f744f599-776fp\" (UID: \"86bfea27-2a17-463f-9768-201c49599d74\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-776fp" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.177156 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.192657 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.196649 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.217501 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.238069 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.257724 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-j6jsf"] Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.257819 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.277594 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.287622 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-776fp" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.298064 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.318337 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.337696 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.358867 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.376223 4734 request.go:700] Waited for 1.010708204s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dpprof-cert&limit=500&resourceVersion=0 Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.378602 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.399492 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.418099 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.423162 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fsrfk" 
event={"ID":"376d1200-a143-4f81-9399-41fdafd1f0b1","Type":"ContainerStarted","Data":"60d67dc6b02aee9af46d24e4520f72f53517f8d5a6954d4f99fb039861c974b0"} Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.424630 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x5rwz" event={"ID":"b7ff8265-c71c-4b81-a8db-3b68a2118fd6","Type":"ContainerStarted","Data":"86164c7afe8662c7faf0f731867ecf5fdb67d8fdc307b1bd32a896b65f6bc2ce"} Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.437477 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.457959 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.478316 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.498064 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.518173 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.537813 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.557810 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.578317 4734 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.598776 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.618906 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.638824 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.658674 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.688856 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.698904 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.718629 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.738608 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.758518 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.778115 4734 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.798410 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 05 23:21:57 crc kubenswrapper[4734]: W1205 23:21:57.820450 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74a8397f_0607_4761_9fc5_77e9a6d197c8.slice/crio-a70a026db5d36a1404f5c118f7389c3ad2793429614094e05726a40b9ec3aebb WatchSource:0}: Error finding container a70a026db5d36a1404f5c118f7389c3ad2793429614094e05726a40b9ec3aebb: Status 404 returned error can't find the container with id a70a026db5d36a1404f5c118f7389c3ad2793429614094e05726a40b9ec3aebb Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.820654 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.838308 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.858435 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.878205 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.899865 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.919289 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 
23:21:57.941793 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.958680 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.978597 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 05 23:21:57 crc kubenswrapper[4734]: I1205 23:21:57.997700 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.021248 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.034832 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-776fp"] Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.037715 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 23:21:58 crc kubenswrapper[4734]: W1205 23:21:58.048895 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86bfea27_2a17_463f_9768_201c49599d74.slice/crio-fd2f8810d8462974c3f0b57bb4d27e6e07415ba9edd162ca3d918b1e86559d4d WatchSource:0}: Error finding container fd2f8810d8462974c3f0b57bb4d27e6e07415ba9edd162ca3d918b1e86559d4d: Status 404 returned error can't find the container with id fd2f8810d8462974c3f0b57bb4d27e6e07415ba9edd162ca3d918b1e86559d4d Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.061333 4734 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.063393 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz"] Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.077269 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 05 23:21:58 crc kubenswrapper[4734]: W1205 23:21:58.093763 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07e87699_6af3_4f68_a3c8_85780433774b.slice/crio-96d37318aa709b539b753e35e67bac2dd01cdc8405085db2dc636155b877fde4 WatchSource:0}: Error finding container 96d37318aa709b539b753e35e67bac2dd01cdc8405085db2dc636155b877fde4: Status 404 returned error can't find the container with id 96d37318aa709b539b753e35e67bac2dd01cdc8405085db2dc636155b877fde4 Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.098033 4734 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.118842 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.138119 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.157363 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.177307 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.198669 4734 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.217660 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.238213 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.314734 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mhsf\" (UniqueName: \"kubernetes.io/projected/4520b844-1a95-4300-8a10-5ef68e2067cb-kube-api-access-2mhsf\") pod \"machine-approver-56656f9798-xsgsc\" (UID: \"4520b844-1a95-4300-8a10-5ef68e2067cb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsgsc" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.318568 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4xnw\" (UniqueName: \"kubernetes.io/projected/448e552f-8a25-469f-b959-4fbe91ae9035-kube-api-access-m4xnw\") pod \"openshift-apiserver-operator-796bbdcf4f-44nn9\" (UID: \"448e552f-8a25-469f-b959-4fbe91ae9035\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44nn9" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.320361 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f4f948a0-bcd5-4e9e-86ec-0429082dac44-registry-tls\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.320588 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/f4f948a0-bcd5-4e9e-86ec-0429082dac44-registry-certificates\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.320668 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4f948a0-bcd5-4e9e-86ec-0429082dac44-trusted-ca\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.320840 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f4f948a0-bcd5-4e9e-86ec-0429082dac44-bound-sa-token\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.321013 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.321095 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgtxb\" (UniqueName: \"kubernetes.io/projected/f4f948a0-bcd5-4e9e-86ec-0429082dac44-kube-api-access-mgtxb\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 
23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.321252 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f4f948a0-bcd5-4e9e-86ec-0429082dac44-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.321309 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f4f948a0-bcd5-4e9e-86ec-0429082dac44-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.321530 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44nn9"
Dec 05 23:21:58 crc kubenswrapper[4734]: E1205 23:21:58.323309 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:21:58.823284396 +0000 UTC m=+139.506688682 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.347506 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjbll\" (UniqueName: \"kubernetes.io/projected/776e53fa-bf9e-44c4-8f89-2f78059733a7-kube-api-access-vjbll\") pod \"openshift-config-operator-7777fb866f-txdvl\" (UID: \"776e53fa-bf9e-44c4-8f89-2f78059733a7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-txdvl"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.354060 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcclz\" (UniqueName: \"kubernetes.io/projected/547872c0-29db-43b6-a531-14610127080d-kube-api-access-rcclz\") pod \"cluster-image-registry-operator-dc59b4c8b-4k9rd\" (UID: \"547872c0-29db-43b6-a531-14610127080d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k9rd"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.386107 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn9xj\" (UniqueName: \"kubernetes.io/projected/99827392-eef9-4b43-ab05-d57f8bc8d3ef-kube-api-access-xn9xj\") pod \"cluster-samples-operator-665b6dd947-wl2fs\" (UID: \"99827392-eef9-4b43-ab05-d57f8bc8d3ef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wl2fs"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.394293 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ldsd\" (UniqueName: \"kubernetes.io/projected/d391f1fa-9bbe-478c-a1da-6ccb8f75f3c5-kube-api-access-6ldsd\") pod \"downloads-7954f5f757-htwjw\" (UID: \"d391f1fa-9bbe-478c-a1da-6ccb8f75f3c5\") " pod="openshift-console/downloads-7954f5f757-htwjw"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.395731 4734 request.go:700] Waited for 1.929736965s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/serviceaccounts/console/token
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.417142 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvcxp\" (UniqueName: \"kubernetes.io/projected/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-kube-api-access-qvcxp\") pod \"console-f9d7485db-5h9wr\" (UID: \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\") " pod="openshift-console/console-f9d7485db-5h9wr"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.421898 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.422064 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f4f948a0-bcd5-4e9e-86ec-0429082dac44-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.422206 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f4f948a0-bcd5-4e9e-86ec-0429082dac44-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.422257 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0663418a-98b9-48b4-869e-257a7bddd32e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-k9fv6\" (UID: \"0663418a-98b9-48b4-869e-257a7bddd32e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k9fv6"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.422283 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f4f948a0-bcd5-4e9e-86ec-0429082dac44-registry-tls\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.422309 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt4xx\" (UniqueName: \"kubernetes.io/projected/0663418a-98b9-48b4-869e-257a7bddd32e-kube-api-access-bt4xx\") pod \"openshift-controller-manager-operator-756b6f6bc6-k9fv6\" (UID: \"0663418a-98b9-48b4-869e-257a7bddd32e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k9fv6"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.422851 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f4f948a0-bcd5-4e9e-86ec-0429082dac44-registry-certificates\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.422893 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0663418a-98b9-48b4-869e-257a7bddd32e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-k9fv6\" (UID: \"0663418a-98b9-48b4-869e-257a7bddd32e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k9fv6"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.422934 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4f948a0-bcd5-4e9e-86ec-0429082dac44-trusted-ca\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.422984 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f4f948a0-bcd5-4e9e-86ec-0429082dac44-bound-sa-token\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.423046 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgtxb\" (UniqueName: \"kubernetes.io/projected/f4f948a0-bcd5-4e9e-86ec-0429082dac44-kube-api-access-mgtxb\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw"
Dec 05 23:21:58 crc kubenswrapper[4734]: E1205 23:21:58.423333 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:21:58.923315214 +0000 UTC m=+139.606719500 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.424389 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f4f948a0-bcd5-4e9e-86ec-0429082dac44-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.425497 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f4f948a0-bcd5-4e9e-86ec-0429082dac44-registry-certificates\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.427458 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f4f948a0-bcd5-4e9e-86ec-0429082dac44-registry-tls\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.429158 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f4f948a0-bcd5-4e9e-86ec-0429082dac44-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.430453 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x5rwz" event={"ID":"b7ff8265-c71c-4b81-a8db-3b68a2118fd6","Type":"ContainerStarted","Data":"086fca1fc44c3e329041c16f083699c2d523501db3f50afa0a070e5ef6b3b4c4"}
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.431406 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x5rwz"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.431940 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4f948a0-bcd5-4e9e-86ec-0429082dac44-trusted-ca\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.433244 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-j6jsf" event={"ID":"74a8397f-0607-4761-9fc5-77e9a6d197c8","Type":"ContainerStarted","Data":"ca8c9dde6926e44a2771499e7a7e7ff2dd5f23eb223cb3b7e046a0287254e273"}
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.433292 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-j6jsf" event={"ID":"74a8397f-0607-4761-9fc5-77e9a6d197c8","Type":"ContainerStarted","Data":"a70a026db5d36a1404f5c118f7389c3ad2793429614094e05726a40b9ec3aebb"}
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.433848 4734 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-x5rwz container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body=
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.433884 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x5rwz" podUID="b7ff8265-c71c-4b81-a8db-3b68a2118fd6" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.440859 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz" event={"ID":"07e87699-6af3-4f68-a3c8-85780433774b","Type":"ContainerStarted","Data":"96d37318aa709b539b753e35e67bac2dd01cdc8405085db2dc636155b877fde4"}
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.442147 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-776fp" event={"ID":"86bfea27-2a17-463f-9768-201c49599d74","Type":"ContainerStarted","Data":"fd2f8810d8462974c3f0b57bb4d27e6e07415ba9edd162ca3d918b1e86559d4d"}
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.445142 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fsrfk" event={"ID":"376d1200-a143-4f81-9399-41fdafd1f0b1","Type":"ContainerStarted","Data":"e2286978561d0807de2e2e499ff921509cdd52a708d18a2c37e60899c32b92e8"}
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.445385 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-fsrfk"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.494955 4734 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-fsrfk container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body=
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.495022 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-fsrfk" podUID="376d1200-a143-4f81-9399-41fdafd1f0b1" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.498084 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6zcx\" (UniqueName: \"kubernetes.io/projected/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-kube-api-access-t6zcx\") pod \"oauth-openshift-558db77b4-gxdpj\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.503703 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsgsc"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.506853 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf8j9\" (UniqueName: \"kubernetes.io/projected/30f5e37c-8b52-4347-bc5e-a973ca06a7bf-kube-api-access-bf8j9\") pod \"console-operator-58897d9998-gkww2\" (UID: \"30f5e37c-8b52-4347-bc5e-a973ca06a7bf\") " pod="openshift-console-operator/console-operator-58897d9998-gkww2"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.507976 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/547872c0-29db-43b6-a531-14610127080d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4k9rd\" (UID: \"547872c0-29db-43b6-a531-14610127080d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k9rd"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.509644 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0152808-f0b7-4ce4-9bc1-6bc11e69bd7f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2dx28\" (UID: \"a0152808-f0b7-4ce4-9bc1-6bc11e69bd7f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2dx28"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.511345 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.514381 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbtkr\" (UniqueName: \"kubernetes.io/projected/7adf273b-63fb-40fe-9d0d-fe467260565b-kube-api-access-vbtkr\") pod \"apiserver-76f77b778f-x67qn\" (UID: \"7adf273b-63fb-40fe-9d0d-fe467260565b\") " pod="openshift-apiserver/apiserver-76f77b778f-x67qn"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.521517 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-gkww2"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.521699 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.524762 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7r2n\" (UniqueName: \"kubernetes.io/projected/8f1fb91a-4e37-4bad-baa4-4996c7dd06e8-kube-api-access-l7r2n\") pod \"service-ca-9c57cc56f-ds2cw\" (UID: \"8f1fb91a-4e37-4bad-baa4-4996c7dd06e8\") " pod="openshift-service-ca/service-ca-9c57cc56f-ds2cw"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.524805 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbjd2\" (UniqueName: \"kubernetes.io/projected/d5a2bf75-6d4d-40e9-a4d2-0aa192d25cc8-kube-api-access-cbjd2\") pod \"machine-config-server-mghs5\" (UID: \"d5a2bf75-6d4d-40e9-a4d2-0aa192d25cc8\") " pod="openshift-machine-config-operator/machine-config-server-mghs5"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.524859 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0de1aed3-e393-4d4f-b201-12142736c664-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mrqvs\" (UID: \"0de1aed3-e393-4d4f-b201-12142736c664\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrqvs"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.524878 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/de8d0e19-aece-4044-9eb8-ede1e5edda45-images\") pod \"machine-config-operator-74547568cd-d9vxl\" (UID: \"de8d0e19-aece-4044-9eb8-ede1e5edda45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9vxl"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.524918 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8f1fb91a-4e37-4bad-baa4-4996c7dd06e8-signing-key\") pod \"service-ca-9c57cc56f-ds2cw\" (UID: \"8f1fb91a-4e37-4bad-baa4-4996c7dd06e8\") " pod="openshift-service-ca/service-ca-9c57cc56f-ds2cw"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.524938 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppnph\" (UniqueName: \"kubernetes.io/projected/c8fec47c-0cfe-45b1-9f45-5eba4c924359-kube-api-access-ppnph\") pod \"csi-hostpathplugin-w5bmh\" (UID: \"c8fec47c-0cfe-45b1-9f45-5eba4c924359\") " pod="hostpath-provisioner/csi-hostpathplugin-w5bmh"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.524957 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwnhv\" (UniqueName: \"kubernetes.io/projected/36efc3c0-8de6-423d-bb0e-c76488f53955-kube-api-access-qwnhv\") pod \"machine-config-controller-84d6567774-dqz9r\" (UID: \"36efc3c0-8de6-423d-bb0e-c76488f53955\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dqz9r"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.525018 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/36efc3c0-8de6-423d-bb0e-c76488f53955-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dqz9r\" (UID: \"36efc3c0-8de6-423d-bb0e-c76488f53955\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dqz9r"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.525042 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82e59745-0a0a-4c7c-a61b-d801aed4d11d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-qqvvd\" (UID: \"82e59745-0a0a-4c7c-a61b-d801aed4d11d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qqvvd"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.525062 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5069f4e4-270c-4fa2-9121-e6da86b389d1-serving-cert\") pod \"etcd-operator-b45778765-qmqbr\" (UID: \"5069f4e4-270c-4fa2-9121-e6da86b389d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qmqbr"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.525082 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eae91bc6-fbec-4bb5-81f7-254dc473427e-metrics-tls\") pod \"dns-default-4htrx\" (UID: \"eae91bc6-fbec-4bb5-81f7-254dc473427e\") " pod="openshift-dns/dns-default-4htrx"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.525102 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a58228cf-5189-4c26-b772-a1c2145873a0-metrics-tls\") pod \"dns-operator-744455d44c-svcq4\" (UID: \"a58228cf-5189-4c26-b772-a1c2145873a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-svcq4"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.525126 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8lfx\" (UniqueName: \"kubernetes.io/projected/3be50f08-5e27-434b-8862-52c075569d6d-kube-api-access-d8lfx\") pod \"ingress-canary-6l9sd\" (UID: \"3be50f08-5e27-434b-8862-52c075569d6d\") " pod="openshift-ingress-canary/ingress-canary-6l9sd"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.525270 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8251ea29-3180-4d6c-a6f7-6477bcd8ed6f-metrics-certs\") pod \"router-default-5444994796-2djpb\" (UID: \"8251ea29-3180-4d6c-a6f7-6477bcd8ed6f\") " pod="openshift-ingress/router-default-5444994796-2djpb"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.525509 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5069f4e4-270c-4fa2-9121-e6da86b389d1-config\") pod \"etcd-operator-b45778765-qmqbr\" (UID: \"5069f4e4-270c-4fa2-9121-e6da86b389d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qmqbr"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.525576 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eb389c3f-69cc-49bd-b413-3fbbf370ad41-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zwcnb\" (UID: \"eb389c3f-69cc-49bd-b413-3fbbf370ad41\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zwcnb"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.525666 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgvw5\" (UniqueName: \"kubernetes.io/projected/eae91bc6-fbec-4bb5-81f7-254dc473427e-kube-api-access-zgvw5\") pod \"dns-default-4htrx\" (UID: \"eae91bc6-fbec-4bb5-81f7-254dc473427e\") " pod="openshift-dns/dns-default-4htrx"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.525694 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d5a2bf75-6d4d-40e9-a4d2-0aa192d25cc8-node-bootstrap-token\") pod \"machine-config-server-mghs5\" (UID: \"d5a2bf75-6d4d-40e9-a4d2-0aa192d25cc8\") " pod="openshift-machine-config-operator/machine-config-server-mghs5"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.525763 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m98bg\" (UniqueName: \"kubernetes.io/projected/0cd8fc51-deec-410b-b2bb-4818c2f71230-kube-api-access-m98bg\") pod \"marketplace-operator-79b997595-ws6qt\" (UID: \"0cd8fc51-deec-410b-b2bb-4818c2f71230\") " pod="openshift-marketplace/marketplace-operator-79b997595-ws6qt"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.525819 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/de8d0e19-aece-4044-9eb8-ede1e5edda45-auth-proxy-config\") pod \"machine-config-operator-74547568cd-d9vxl\" (UID: \"de8d0e19-aece-4044-9eb8-ede1e5edda45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9vxl"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.525846 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/24632d76-79cd-400b-bfad-a4c8a0ffbb68-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9lr2g\" (UID: \"24632d76-79cd-400b-bfad-a4c8a0ffbb68\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lr2g"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.526186 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c8fec47c-0cfe-45b1-9f45-5eba4c924359-socket-dir\") pod \"csi-hostpathplugin-w5bmh\" (UID: \"c8fec47c-0cfe-45b1-9f45-5eba4c924359\") " pod="hostpath-provisioner/csi-hostpathplugin-w5bmh"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.526303 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/05c3d993-bbb4-4f67-8952-23d9b107b889-apiservice-cert\") pod \"packageserver-d55dfcdfc-4jlxs\" (UID: \"05c3d993-bbb4-4f67-8952-23d9b107b889\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jlxs"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.526464 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a20dbad-8352-4804-9c0e-a2b6108a0d1b-secret-volume\") pod \"collect-profiles-29416275-mfkvp\" (UID: \"2a20dbad-8352-4804-9c0e-a2b6108a0d1b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416275-mfkvp"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.527935 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9e08ddd6-cfa7-4e6b-902f-d789f91fd70b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kkt7h\" (UID: \"9e08ddd6-cfa7-4e6b-902f-d789f91fd70b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kkt7h"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.528126 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb389c3f-69cc-49bd-b413-3fbbf370ad41-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zwcnb\" (UID: \"eb389c3f-69cc-49bd-b413-3fbbf370ad41\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zwcnb"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.528175 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de8d0e19-aece-4044-9eb8-ede1e5edda45-proxy-tls\") pod \"machine-config-operator-74547568cd-d9vxl\" (UID: \"de8d0e19-aece-4044-9eb8-ede1e5edda45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9vxl"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.528774 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd8ws\" (UniqueName: \"kubernetes.io/projected/2a20dbad-8352-4804-9c0e-a2b6108a0d1b-kube-api-access-gd8ws\") pod \"collect-profiles-29416275-mfkvp\" (UID: \"2a20dbad-8352-4804-9c0e-a2b6108a0d1b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416275-mfkvp"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.528795 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a20501a9-7a2f-46ba-8322-a9b38d14bb4a-profile-collector-cert\") pod \"catalog-operator-68c6474976-4wmxp\" (UID: \"a20501a9-7a2f-46ba-8322-a9b38d14bb4a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wmxp"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.528834 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e08ddd6-cfa7-4e6b-902f-d789f91fd70b-trusted-ca\") pod \"ingress-operator-5b745b69d9-kkt7h\" (UID: \"9e08ddd6-cfa7-4e6b-902f-d789f91fd70b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kkt7h"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.528855 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szzhz\" (UniqueName: \"kubernetes.io/projected/0de1aed3-e393-4d4f-b201-12142736c664-kube-api-access-szzhz\") pod \"olm-operator-6b444d44fb-mrqvs\" (UID: \"0de1aed3-e393-4d4f-b201-12142736c664\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrqvs"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.528897 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fngsb\" (UniqueName: \"kubernetes.io/projected/2e6d7ac5-5f33-4561-a79b-685f9ae74144-kube-api-access-fngsb\") pod \"package-server-manager-789f6589d5-tsb97\" (UID: \"2e6d7ac5-5f33-4561-a79b-685f9ae74144\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tsb97"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.528919 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c8fec47c-0cfe-45b1-9f45-5eba4c924359-mountpoint-dir\") pod \"csi-hostpathplugin-w5bmh\" (UID: \"c8fec47c-0cfe-45b1-9f45-5eba4c924359\") " pod="hostpath-provisioner/csi-hostpathplugin-w5bmh"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.528942 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbrws\" (UniqueName: \"kubernetes.io/projected/8251ea29-3180-4d6c-a6f7-6477bcd8ed6f-kube-api-access-rbrws\") pod \"router-default-5444994796-2djpb\" (UID: \"8251ea29-3180-4d6c-a6f7-6477bcd8ed6f\") " pod="openshift-ingress/router-default-5444994796-2djpb"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.528987 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a20dbad-8352-4804-9c0e-a2b6108a0d1b-config-volume\") pod \"collect-profiles-29416275-mfkvp\" (UID: \"2a20dbad-8352-4804-9c0e-a2b6108a0d1b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416275-mfkvp"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.529024 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5069f4e4-270c-4fa2-9121-e6da86b389d1-etcd-client\") pod \"etcd-operator-b45778765-qmqbr\" (UID: \"5069f4e4-270c-4fa2-9121-e6da86b389d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qmqbr"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.529047 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9e08ddd6-cfa7-4e6b-902f-d789f91fd70b-metrics-tls\") pod \"ingress-operator-5b745b69d9-kkt7h\" (UID: \"9e08ddd6-cfa7-4e6b-902f-d789f91fd70b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kkt7h"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.529140 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl2rb\" (UniqueName: \"kubernetes.io/projected/5069f4e4-270c-4fa2-9121-e6da86b389d1-kube-api-access-sl2rb\") pod \"etcd-operator-b45778765-qmqbr\" (UID: \"5069f4e4-270c-4fa2-9121-e6da86b389d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qmqbr"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.529167 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dnsc\" (UniqueName: \"kubernetes.io/projected/eb3de870-5133-4874-bc83-37be5f299296-kube-api-access-8dnsc\") pod \"migrator-59844c95c7-n9fhz\" (UID: \"eb3de870-5133-4874-bc83-37be5f299296\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-n9fhz"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.529204 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clcxc\" (UniqueName: \"kubernetes.io/projected/64b05eab-74bd-43bd-b206-54f4e784e581-kube-api-access-clcxc\") pod \"multus-admission-controller-857f4d67dd-m6c2b\" (UID: \"64b05eab-74bd-43bd-b206-54f4e784e581\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-m6c2b"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.529234 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24632d76-79cd-400b-bfad-a4c8a0ffbb68-config\") pod \"kube-apiserver-operator-766d6c64bb-9lr2g\" (UID: \"24632d76-79cd-400b-bfad-a4c8a0ffbb68\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lr2g"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.530677 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0663418a-98b9-48b4-869e-257a7bddd32e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-k9fv6\" (UID: \"0663418a-98b9-48b4-869e-257a7bddd32e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k9fv6"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.530711 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/36efc3c0-8de6-423d-bb0e-c76488f53955-proxy-tls\") pod \"machine-config-controller-84d6567774-dqz9r\" (UID: \"36efc3c0-8de6-423d-bb0e-c76488f53955\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dqz9r"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.530753 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24632d76-79cd-400b-bfad-a4c8a0ffbb68-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9lr2g\" (UID: \"24632d76-79cd-400b-bfad-a4c8a0ffbb68\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lr2g"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.531482 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8251ea29-3180-4d6c-a6f7-6477bcd8ed6f-service-ca-bundle\") pod \"router-default-5444994796-2djpb\" (UID: \"8251ea29-3180-4d6c-a6f7-6477bcd8ed6f\") " pod="openshift-ingress/router-default-5444994796-2djpb"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.531518 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0cd8fc51-deec-410b-b2bb-4818c2f71230-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ws6qt\" (UID: \"0cd8fc51-deec-410b-b2bb-4818c2f71230\") " pod="openshift-marketplace/marketplace-operator-79b997595-ws6qt"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.531597 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/7b7cfcc6-ee0e-4af5-9e03-5d8cbb40edbb-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qvhvd\" (UID: \"7b7cfcc6-ee0e-4af5-9e03-5d8cbb40edbb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qvhvd"
Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.531657 4734
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2e6d7ac5-5f33-4561-a79b-685f9ae74144-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-tsb97\" (UID: \"2e6d7ac5-5f33-4561-a79b-685f9ae74144\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tsb97" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.531706 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb389c3f-69cc-49bd-b413-3fbbf370ad41-config\") pod \"kube-controller-manager-operator-78b949d7b-zwcnb\" (UID: \"eb389c3f-69cc-49bd-b413-3fbbf370ad41\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zwcnb" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.532867 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/64b05eab-74bd-43bd-b206-54f4e784e581-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-m6c2b\" (UID: \"64b05eab-74bd-43bd-b206-54f4e784e581\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-m6c2b" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.532891 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2c4t\" (UniqueName: \"kubernetes.io/projected/05c3d993-bbb4-4f67-8952-23d9b107b889-kube-api-access-n2c4t\") pod \"packageserver-d55dfcdfc-4jlxs\" (UID: \"05c3d993-bbb4-4f67-8952-23d9b107b889\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jlxs" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.532931 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0663418a-98b9-48b4-869e-257a7bddd32e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-k9fv6\" (UID: \"0663418a-98b9-48b4-869e-257a7bddd32e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k9fv6" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.532984 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6np8v\" (UniqueName: \"kubernetes.io/projected/9e08ddd6-cfa7-4e6b-902f-d789f91fd70b-kube-api-access-6np8v\") pod \"ingress-operator-5b745b69d9-kkt7h\" (UID: \"9e08ddd6-cfa7-4e6b-902f-d789f91fd70b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kkt7h" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.533039 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/05c3d993-bbb4-4f67-8952-23d9b107b889-webhook-cert\") pod \"packageserver-d55dfcdfc-4jlxs\" (UID: \"05c3d993-bbb4-4f67-8952-23d9b107b889\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jlxs" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.533148 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c8fec47c-0cfe-45b1-9f45-5eba4c924359-registration-dir\") pod \"csi-hostpathplugin-w5bmh\" (UID: \"c8fec47c-0cfe-45b1-9f45-5eba4c924359\") " pod="hostpath-provisioner/csi-hostpathplugin-w5bmh" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.533168 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92mnn\" (UniqueName: \"kubernetes.io/projected/de8d0e19-aece-4044-9eb8-ede1e5edda45-kube-api-access-92mnn\") pod \"machine-config-operator-74547568cd-d9vxl\" (UID: \"de8d0e19-aece-4044-9eb8-ede1e5edda45\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9vxl" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.533233 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.533323 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0663418a-98b9-48b4-869e-257a7bddd32e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-k9fv6\" (UID: \"0663418a-98b9-48b4-869e-257a7bddd32e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k9fv6" Dec 05 23:21:58 crc kubenswrapper[4734]: E1205 23:21:58.535118 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:21:59.035094712 +0000 UTC m=+139.718499188 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.535396 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c8fec47c-0cfe-45b1-9f45-5eba4c924359-csi-data-dir\") pod \"csi-hostpathplugin-w5bmh\" (UID: \"c8fec47c-0cfe-45b1-9f45-5eba4c924359\") " pod="hostpath-provisioner/csi-hostpathplugin-w5bmh" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.535962 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c8fec47c-0cfe-45b1-9f45-5eba4c924359-plugins-dir\") pod \"csi-hostpathplugin-w5bmh\" (UID: \"c8fec47c-0cfe-45b1-9f45-5eba4c924359\") " pod="hostpath-provisioner/csi-hostpathplugin-w5bmh" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.536050 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5069f4e4-270c-4fa2-9121-e6da86b389d1-etcd-service-ca\") pod \"etcd-operator-b45778765-qmqbr\" (UID: \"5069f4e4-270c-4fa2-9121-e6da86b389d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qmqbr" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.536096 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17750e75-596f-4637-a240-55aabb725a86-serving-cert\") pod 
\"service-ca-operator-777779d784-ckxzz\" (UID: \"17750e75-596f-4637-a240-55aabb725a86\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ckxzz" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.536119 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxsbj\" (UniqueName: \"kubernetes.io/projected/a58228cf-5189-4c26-b772-a1c2145873a0-kube-api-access-hxsbj\") pod \"dns-operator-744455d44c-svcq4\" (UID: \"a58228cf-5189-4c26-b772-a1c2145873a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-svcq4" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.536143 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdmfb\" (UniqueName: \"kubernetes.io/projected/7b7cfcc6-ee0e-4af5-9e03-5d8cbb40edbb-kube-api-access-qdmfb\") pod \"control-plane-machine-set-operator-78cbb6b69f-qvhvd\" (UID: \"7b7cfcc6-ee0e-4af5-9e03-5d8cbb40edbb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qvhvd" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.536213 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eae91bc6-fbec-4bb5-81f7-254dc473427e-config-volume\") pod \"dns-default-4htrx\" (UID: \"eae91bc6-fbec-4bb5-81f7-254dc473427e\") " pod="openshift-dns/dns-default-4htrx" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.536238 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/05c3d993-bbb4-4f67-8952-23d9b107b889-tmpfs\") pod \"packageserver-d55dfcdfc-4jlxs\" (UID: \"05c3d993-bbb4-4f67-8952-23d9b107b889\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jlxs" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.536295 4734 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8251ea29-3180-4d6c-a6f7-6477bcd8ed6f-stats-auth\") pod \"router-default-5444994796-2djpb\" (UID: \"8251ea29-3180-4d6c-a6f7-6477bcd8ed6f\") " pod="openshift-ingress/router-default-5444994796-2djpb" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.538734 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-txdvl" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.539058 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.541380 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scbt4\" (UniqueName: \"kubernetes.io/projected/82e59745-0a0a-4c7c-a61b-d801aed4d11d-kube-api-access-scbt4\") pod \"kube-storage-version-migrator-operator-b67b599dd-qqvvd\" (UID: \"82e59745-0a0a-4c7c-a61b-d801aed4d11d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qqvvd" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.541462 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a20501a9-7a2f-46ba-8322-a9b38d14bb4a-srv-cert\") pod \"catalog-operator-68c6474976-4wmxp\" (UID: \"a20501a9-7a2f-46ba-8322-a9b38d14bb4a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wmxp" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.542196 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8251ea29-3180-4d6c-a6f7-6477bcd8ed6f-default-certificate\") pod \"router-default-5444994796-2djpb\" (UID: 
\"8251ea29-3180-4d6c-a6f7-6477bcd8ed6f\") " pod="openshift-ingress/router-default-5444994796-2djpb" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.542223 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8f1fb91a-4e37-4bad-baa4-4996c7dd06e8-signing-cabundle\") pod \"service-ca-9c57cc56f-ds2cw\" (UID: \"8f1fb91a-4e37-4bad-baa4-4996c7dd06e8\") " pod="openshift-service-ca/service-ca-9c57cc56f-ds2cw" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.542267 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d5a2bf75-6d4d-40e9-a4d2-0aa192d25cc8-certs\") pod \"machine-config-server-mghs5\" (UID: \"d5a2bf75-6d4d-40e9-a4d2-0aa192d25cc8\") " pod="openshift-machine-config-operator/machine-config-server-mghs5" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.542304 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5069f4e4-270c-4fa2-9121-e6da86b389d1-etcd-ca\") pod \"etcd-operator-b45778765-qmqbr\" (UID: \"5069f4e4-270c-4fa2-9121-e6da86b389d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qmqbr" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.542626 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17750e75-596f-4637-a240-55aabb725a86-config\") pod \"service-ca-operator-777779d784-ckxzz\" (UID: \"17750e75-596f-4637-a240-55aabb725a86\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ckxzz" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.543552 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/82e59745-0a0a-4c7c-a61b-d801aed4d11d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-qqvvd\" (UID: \"82e59745-0a0a-4c7c-a61b-d801aed4d11d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qqvvd" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.543593 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3be50f08-5e27-434b-8862-52c075569d6d-cert\") pod \"ingress-canary-6l9sd\" (UID: \"3be50f08-5e27-434b-8862-52c075569d6d\") " pod="openshift-ingress-canary/ingress-canary-6l9sd" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.543634 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc5ql\" (UniqueName: \"kubernetes.io/projected/a20501a9-7a2f-46ba-8322-a9b38d14bb4a-kube-api-access-pc5ql\") pod \"catalog-operator-68c6474976-4wmxp\" (UID: \"a20501a9-7a2f-46ba-8322-a9b38d14bb4a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wmxp" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.543716 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt4xx\" (UniqueName: \"kubernetes.io/projected/0663418a-98b9-48b4-869e-257a7bddd32e-kube-api-access-bt4xx\") pod \"openshift-controller-manager-operator-756b6f6bc6-k9fv6\" (UID: \"0663418a-98b9-48b4-869e-257a7bddd32e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k9fv6" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.543778 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0cd8fc51-deec-410b-b2bb-4818c2f71230-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ws6qt\" (UID: 
\"0cd8fc51-deec-410b-b2bb-4818c2f71230\") " pod="openshift-marketplace/marketplace-operator-79b997595-ws6qt" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.543803 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7848p\" (UniqueName: \"kubernetes.io/projected/17750e75-596f-4637-a240-55aabb725a86-kube-api-access-7848p\") pod \"service-ca-operator-777779d784-ckxzz\" (UID: \"17750e75-596f-4637-a240-55aabb725a86\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ckxzz" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.543849 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0de1aed3-e393-4d4f-b201-12142736c664-srv-cert\") pod \"olm-operator-6b444d44fb-mrqvs\" (UID: \"0de1aed3-e393-4d4f-b201-12142736c664\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrqvs" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.545015 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-htwjw" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.546137 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0663418a-98b9-48b4-869e-257a7bddd32e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-k9fv6\" (UID: \"0663418a-98b9-48b4-869e-257a7bddd32e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k9fv6" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.553818 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wl2fs" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.569057 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.576342 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k9rd" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.578161 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.580704 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-5h9wr" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.626206 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.628925 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44nn9"] Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.629442 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2dx28" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.633348 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.639749 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.644478 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:21:58 crc kubenswrapper[4734]: W1205 23:21:58.646863 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod448e552f_8a25_469f_b959_4fbe91ae9035.slice/crio-529cbb5d3a7af5792eae4113bd6e901da67805ba6ca43a1ddf6295a3f6bec735 WatchSource:0}: Error finding container 529cbb5d3a7af5792eae4113bd6e901da67805ba6ca43a1ddf6295a3f6bec735: Status 404 returned error can't find the container with id 529cbb5d3a7af5792eae4113bd6e901da67805ba6ca43a1ddf6295a3f6bec735 Dec 05 23:21:58 crc kubenswrapper[4734]: E1205 23:21:58.647058 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:21:59.147032191 +0000 UTC m=+139.830436467 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652158 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a20dbad-8352-4804-9c0e-a2b6108a0d1b-secret-volume\") pod \"collect-profiles-29416275-mfkvp\" (UID: \"2a20dbad-8352-4804-9c0e-a2b6108a0d1b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416275-mfkvp" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652195 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9e08ddd6-cfa7-4e6b-902f-d789f91fd70b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kkt7h\" (UID: \"9e08ddd6-cfa7-4e6b-902f-d789f91fd70b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kkt7h" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652215 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de8d0e19-aece-4044-9eb8-ede1e5edda45-proxy-tls\") pod \"machine-config-operator-74547568cd-d9vxl\" (UID: \"de8d0e19-aece-4044-9eb8-ede1e5edda45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9vxl" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652250 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb389c3f-69cc-49bd-b413-3fbbf370ad41-serving-cert\") pod 
\"kube-controller-manager-operator-78b949d7b-zwcnb\" (UID: \"eb389c3f-69cc-49bd-b413-3fbbf370ad41\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zwcnb" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652275 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd8ws\" (UniqueName: \"kubernetes.io/projected/2a20dbad-8352-4804-9c0e-a2b6108a0d1b-kube-api-access-gd8ws\") pod \"collect-profiles-29416275-mfkvp\" (UID: \"2a20dbad-8352-4804-9c0e-a2b6108a0d1b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416275-mfkvp" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652294 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a20501a9-7a2f-46ba-8322-a9b38d14bb4a-profile-collector-cert\") pod \"catalog-operator-68c6474976-4wmxp\" (UID: \"a20501a9-7a2f-46ba-8322-a9b38d14bb4a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wmxp" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652333 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fngsb\" (UniqueName: \"kubernetes.io/projected/2e6d7ac5-5f33-4561-a79b-685f9ae74144-kube-api-access-fngsb\") pod \"package-server-manager-789f6589d5-tsb97\" (UID: \"2e6d7ac5-5f33-4561-a79b-685f9ae74144\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tsb97" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652353 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e08ddd6-cfa7-4e6b-902f-d789f91fd70b-trusted-ca\") pod \"ingress-operator-5b745b69d9-kkt7h\" (UID: \"9e08ddd6-cfa7-4e6b-902f-d789f91fd70b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kkt7h" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652376 
4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szzhz\" (UniqueName: \"kubernetes.io/projected/0de1aed3-e393-4d4f-b201-12142736c664-kube-api-access-szzhz\") pod \"olm-operator-6b444d44fb-mrqvs\" (UID: \"0de1aed3-e393-4d4f-b201-12142736c664\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrqvs" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652401 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c8fec47c-0cfe-45b1-9f45-5eba4c924359-mountpoint-dir\") pod \"csi-hostpathplugin-w5bmh\" (UID: \"c8fec47c-0cfe-45b1-9f45-5eba4c924359\") " pod="hostpath-provisioner/csi-hostpathplugin-w5bmh" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652420 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbrws\" (UniqueName: \"kubernetes.io/projected/8251ea29-3180-4d6c-a6f7-6477bcd8ed6f-kube-api-access-rbrws\") pod \"router-default-5444994796-2djpb\" (UID: \"8251ea29-3180-4d6c-a6f7-6477bcd8ed6f\") " pod="openshift-ingress/router-default-5444994796-2djpb" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652437 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a20dbad-8352-4804-9c0e-a2b6108a0d1b-config-volume\") pod \"collect-profiles-29416275-mfkvp\" (UID: \"2a20dbad-8352-4804-9c0e-a2b6108a0d1b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416275-mfkvp" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652465 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5069f4e4-270c-4fa2-9121-e6da86b389d1-etcd-client\") pod \"etcd-operator-b45778765-qmqbr\" (UID: \"5069f4e4-270c-4fa2-9121-e6da86b389d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qmqbr" Dec 
05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652482 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9e08ddd6-cfa7-4e6b-902f-d789f91fd70b-metrics-tls\") pod \"ingress-operator-5b745b69d9-kkt7h\" (UID: \"9e08ddd6-cfa7-4e6b-902f-d789f91fd70b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kkt7h" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652515 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl2rb\" (UniqueName: \"kubernetes.io/projected/5069f4e4-270c-4fa2-9121-e6da86b389d1-kube-api-access-sl2rb\") pod \"etcd-operator-b45778765-qmqbr\" (UID: \"5069f4e4-270c-4fa2-9121-e6da86b389d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qmqbr" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652553 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dnsc\" (UniqueName: \"kubernetes.io/projected/eb3de870-5133-4874-bc83-37be5f299296-kube-api-access-8dnsc\") pod \"migrator-59844c95c7-n9fhz\" (UID: \"eb3de870-5133-4874-bc83-37be5f299296\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-n9fhz" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652577 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clcxc\" (UniqueName: \"kubernetes.io/projected/64b05eab-74bd-43bd-b206-54f4e784e581-kube-api-access-clcxc\") pod \"multus-admission-controller-857f4d67dd-m6c2b\" (UID: \"64b05eab-74bd-43bd-b206-54f4e784e581\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-m6c2b" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652607 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24632d76-79cd-400b-bfad-a4c8a0ffbb68-config\") pod \"kube-apiserver-operator-766d6c64bb-9lr2g\" (UID: 
\"24632d76-79cd-400b-bfad-a4c8a0ffbb68\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lr2g" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652630 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/36efc3c0-8de6-423d-bb0e-c76488f53955-proxy-tls\") pod \"machine-config-controller-84d6567774-dqz9r\" (UID: \"36efc3c0-8de6-423d-bb0e-c76488f53955\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dqz9r" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652647 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24632d76-79cd-400b-bfad-a4c8a0ffbb68-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9lr2g\" (UID: \"24632d76-79cd-400b-bfad-a4c8a0ffbb68\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lr2g" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652667 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8251ea29-3180-4d6c-a6f7-6477bcd8ed6f-service-ca-bundle\") pod \"router-default-5444994796-2djpb\" (UID: \"8251ea29-3180-4d6c-a6f7-6477bcd8ed6f\") " pod="openshift-ingress/router-default-5444994796-2djpb" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652685 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0cd8fc51-deec-410b-b2bb-4818c2f71230-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ws6qt\" (UID: \"0cd8fc51-deec-410b-b2bb-4818c2f71230\") " pod="openshift-marketplace/marketplace-operator-79b997595-ws6qt" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652703 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/7b7cfcc6-ee0e-4af5-9e03-5d8cbb40edbb-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qvhvd\" (UID: \"7b7cfcc6-ee0e-4af5-9e03-5d8cbb40edbb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qvhvd" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652722 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2e6d7ac5-5f33-4561-a79b-685f9ae74144-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-tsb97\" (UID: \"2e6d7ac5-5f33-4561-a79b-685f9ae74144\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tsb97" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652740 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb389c3f-69cc-49bd-b413-3fbbf370ad41-config\") pod \"kube-controller-manager-operator-78b949d7b-zwcnb\" (UID: \"eb389c3f-69cc-49bd-b413-3fbbf370ad41\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zwcnb" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652768 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/64b05eab-74bd-43bd-b206-54f4e784e581-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-m6c2b\" (UID: \"64b05eab-74bd-43bd-b206-54f4e784e581\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-m6c2b" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652793 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2c4t\" (UniqueName: \"kubernetes.io/projected/05c3d993-bbb4-4f67-8952-23d9b107b889-kube-api-access-n2c4t\") pod 
\"packageserver-d55dfcdfc-4jlxs\" (UID: \"05c3d993-bbb4-4f67-8952-23d9b107b889\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jlxs" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652821 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6np8v\" (UniqueName: \"kubernetes.io/projected/9e08ddd6-cfa7-4e6b-902f-d789f91fd70b-kube-api-access-6np8v\") pod \"ingress-operator-5b745b69d9-kkt7h\" (UID: \"9e08ddd6-cfa7-4e6b-902f-d789f91fd70b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kkt7h" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652839 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/05c3d993-bbb4-4f67-8952-23d9b107b889-webhook-cert\") pod \"packageserver-d55dfcdfc-4jlxs\" (UID: \"05c3d993-bbb4-4f67-8952-23d9b107b889\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jlxs" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652872 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c8fec47c-0cfe-45b1-9f45-5eba4c924359-registration-dir\") pod \"csi-hostpathplugin-w5bmh\" (UID: \"c8fec47c-0cfe-45b1-9f45-5eba4c924359\") " pod="hostpath-provisioner/csi-hostpathplugin-w5bmh" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652897 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92mnn\" (UniqueName: \"kubernetes.io/projected/de8d0e19-aece-4044-9eb8-ede1e5edda45-kube-api-access-92mnn\") pod \"machine-config-operator-74547568cd-d9vxl\" (UID: \"de8d0e19-aece-4044-9eb8-ede1e5edda45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9vxl" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652932 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652980 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c8fec47c-0cfe-45b1-9f45-5eba4c924359-csi-data-dir\") pod \"csi-hostpathplugin-w5bmh\" (UID: \"c8fec47c-0cfe-45b1-9f45-5eba4c924359\") " pod="hostpath-provisioner/csi-hostpathplugin-w5bmh" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.652998 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c8fec47c-0cfe-45b1-9f45-5eba4c924359-plugins-dir\") pod \"csi-hostpathplugin-w5bmh\" (UID: \"c8fec47c-0cfe-45b1-9f45-5eba4c924359\") " pod="hostpath-provisioner/csi-hostpathplugin-w5bmh" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.653018 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5069f4e4-270c-4fa2-9121-e6da86b389d1-etcd-service-ca\") pod \"etcd-operator-b45778765-qmqbr\" (UID: \"5069f4e4-270c-4fa2-9121-e6da86b389d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qmqbr" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.653036 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17750e75-596f-4637-a240-55aabb725a86-serving-cert\") pod \"service-ca-operator-777779d784-ckxzz\" (UID: \"17750e75-596f-4637-a240-55aabb725a86\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ckxzz" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.653056 4734 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxsbj\" (UniqueName: \"kubernetes.io/projected/a58228cf-5189-4c26-b772-a1c2145873a0-kube-api-access-hxsbj\") pod \"dns-operator-744455d44c-svcq4\" (UID: \"a58228cf-5189-4c26-b772-a1c2145873a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-svcq4" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.653074 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdmfb\" (UniqueName: \"kubernetes.io/projected/7b7cfcc6-ee0e-4af5-9e03-5d8cbb40edbb-kube-api-access-qdmfb\") pod \"control-plane-machine-set-operator-78cbb6b69f-qvhvd\" (UID: \"7b7cfcc6-ee0e-4af5-9e03-5d8cbb40edbb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qvhvd" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.653114 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8251ea29-3180-4d6c-a6f7-6477bcd8ed6f-stats-auth\") pod \"router-default-5444994796-2djpb\" (UID: \"8251ea29-3180-4d6c-a6f7-6477bcd8ed6f\") " pod="openshift-ingress/router-default-5444994796-2djpb" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.653133 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eae91bc6-fbec-4bb5-81f7-254dc473427e-config-volume\") pod \"dns-default-4htrx\" (UID: \"eae91bc6-fbec-4bb5-81f7-254dc473427e\") " pod="openshift-dns/dns-default-4htrx" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.653152 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/05c3d993-bbb4-4f67-8952-23d9b107b889-tmpfs\") pod \"packageserver-d55dfcdfc-4jlxs\" (UID: \"05c3d993-bbb4-4f67-8952-23d9b107b889\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jlxs" Dec 05 23:21:58 crc 
kubenswrapper[4734]: I1205 23:21:58.653171 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scbt4\" (UniqueName: \"kubernetes.io/projected/82e59745-0a0a-4c7c-a61b-d801aed4d11d-kube-api-access-scbt4\") pod \"kube-storage-version-migrator-operator-b67b599dd-qqvvd\" (UID: \"82e59745-0a0a-4c7c-a61b-d801aed4d11d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qqvvd" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.653188 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a20501a9-7a2f-46ba-8322-a9b38d14bb4a-srv-cert\") pod \"catalog-operator-68c6474976-4wmxp\" (UID: \"a20501a9-7a2f-46ba-8322-a9b38d14bb4a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wmxp" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.653213 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8251ea29-3180-4d6c-a6f7-6477bcd8ed6f-default-certificate\") pod \"router-default-5444994796-2djpb\" (UID: \"8251ea29-3180-4d6c-a6f7-6477bcd8ed6f\") " pod="openshift-ingress/router-default-5444994796-2djpb" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.653231 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8f1fb91a-4e37-4bad-baa4-4996c7dd06e8-signing-cabundle\") pod \"service-ca-9c57cc56f-ds2cw\" (UID: \"8f1fb91a-4e37-4bad-baa4-4996c7dd06e8\") " pod="openshift-service-ca/service-ca-9c57cc56f-ds2cw" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.653247 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d5a2bf75-6d4d-40e9-a4d2-0aa192d25cc8-certs\") pod \"machine-config-server-mghs5\" (UID: 
\"d5a2bf75-6d4d-40e9-a4d2-0aa192d25cc8\") " pod="openshift-machine-config-operator/machine-config-server-mghs5" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.653266 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5069f4e4-270c-4fa2-9121-e6da86b389d1-etcd-ca\") pod \"etcd-operator-b45778765-qmqbr\" (UID: \"5069f4e4-270c-4fa2-9121-e6da86b389d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qmqbr" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.653288 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17750e75-596f-4637-a240-55aabb725a86-config\") pod \"service-ca-operator-777779d784-ckxzz\" (UID: \"17750e75-596f-4637-a240-55aabb725a86\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ckxzz" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.653306 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82e59745-0a0a-4c7c-a61b-d801aed4d11d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-qqvvd\" (UID: \"82e59745-0a0a-4c7c-a61b-d801aed4d11d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qqvvd" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.653331 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0cd8fc51-deec-410b-b2bb-4818c2f71230-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ws6qt\" (UID: \"0cd8fc51-deec-410b-b2bb-4818c2f71230\") " pod="openshift-marketplace/marketplace-operator-79b997595-ws6qt" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.653349 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7848p\" 
(UniqueName: \"kubernetes.io/projected/17750e75-596f-4637-a240-55aabb725a86-kube-api-access-7848p\") pod \"service-ca-operator-777779d784-ckxzz\" (UID: \"17750e75-596f-4637-a240-55aabb725a86\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ckxzz" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.653365 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0de1aed3-e393-4d4f-b201-12142736c664-srv-cert\") pod \"olm-operator-6b444d44fb-mrqvs\" (UID: \"0de1aed3-e393-4d4f-b201-12142736c664\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrqvs" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.653380 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3be50f08-5e27-434b-8862-52c075569d6d-cert\") pod \"ingress-canary-6l9sd\" (UID: \"3be50f08-5e27-434b-8862-52c075569d6d\") " pod="openshift-ingress-canary/ingress-canary-6l9sd" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.653397 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc5ql\" (UniqueName: \"kubernetes.io/projected/a20501a9-7a2f-46ba-8322-a9b38d14bb4a-kube-api-access-pc5ql\") pod \"catalog-operator-68c6474976-4wmxp\" (UID: \"a20501a9-7a2f-46ba-8322-a9b38d14bb4a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wmxp" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.653417 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7r2n\" (UniqueName: \"kubernetes.io/projected/8f1fb91a-4e37-4bad-baa4-4996c7dd06e8-kube-api-access-l7r2n\") pod \"service-ca-9c57cc56f-ds2cw\" (UID: \"8f1fb91a-4e37-4bad-baa4-4996c7dd06e8\") " pod="openshift-service-ca/service-ca-9c57cc56f-ds2cw" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.653436 4734 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cbjd2\" (UniqueName: \"kubernetes.io/projected/d5a2bf75-6d4d-40e9-a4d2-0aa192d25cc8-kube-api-access-cbjd2\") pod \"machine-config-server-mghs5\" (UID: \"d5a2bf75-6d4d-40e9-a4d2-0aa192d25cc8\") " pod="openshift-machine-config-operator/machine-config-server-mghs5" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.653452 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0de1aed3-e393-4d4f-b201-12142736c664-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mrqvs\" (UID: \"0de1aed3-e393-4d4f-b201-12142736c664\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrqvs" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.655395 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/05c3d993-bbb4-4f67-8952-23d9b107b889-tmpfs\") pod \"packageserver-d55dfcdfc-4jlxs\" (UID: \"05c3d993-bbb4-4f67-8952-23d9b107b889\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jlxs" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.656440 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/de8d0e19-aece-4044-9eb8-ede1e5edda45-images\") pod \"machine-config-operator-74547568cd-d9vxl\" (UID: \"de8d0e19-aece-4044-9eb8-ede1e5edda45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9vxl" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.656464 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8f1fb91a-4e37-4bad-baa4-4996c7dd06e8-signing-key\") pod \"service-ca-9c57cc56f-ds2cw\" (UID: \"8f1fb91a-4e37-4bad-baa4-4996c7dd06e8\") " pod="openshift-service-ca/service-ca-9c57cc56f-ds2cw" Dec 05 23:21:58 crc 
kubenswrapper[4734]: I1205 23:21:58.656486 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppnph\" (UniqueName: \"kubernetes.io/projected/c8fec47c-0cfe-45b1-9f45-5eba4c924359-kube-api-access-ppnph\") pod \"csi-hostpathplugin-w5bmh\" (UID: \"c8fec47c-0cfe-45b1-9f45-5eba4c924359\") " pod="hostpath-provisioner/csi-hostpathplugin-w5bmh" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.656503 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwnhv\" (UniqueName: \"kubernetes.io/projected/36efc3c0-8de6-423d-bb0e-c76488f53955-kube-api-access-qwnhv\") pod \"machine-config-controller-84d6567774-dqz9r\" (UID: \"36efc3c0-8de6-423d-bb0e-c76488f53955\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dqz9r" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.657269 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.657488 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c8fec47c-0cfe-45b1-9f45-5eba4c924359-mountpoint-dir\") pod \"csi-hostpathplugin-w5bmh\" (UID: \"c8fec47c-0cfe-45b1-9f45-5eba4c924359\") " pod="hostpath-provisioner/csi-hostpathplugin-w5bmh" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.658225 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/36efc3c0-8de6-423d-bb0e-c76488f53955-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dqz9r\" (UID: \"36efc3c0-8de6-423d-bb0e-c76488f53955\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dqz9r" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.658249 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/82e59745-0a0a-4c7c-a61b-d801aed4d11d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-qqvvd\" (UID: \"82e59745-0a0a-4c7c-a61b-d801aed4d11d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qqvvd" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.658270 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5069f4e4-270c-4fa2-9121-e6da86b389d1-serving-cert\") pod \"etcd-operator-b45778765-qmqbr\" (UID: \"5069f4e4-270c-4fa2-9121-e6da86b389d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qmqbr" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.658289 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eae91bc6-fbec-4bb5-81f7-254dc473427e-metrics-tls\") pod \"dns-default-4htrx\" (UID: \"eae91bc6-fbec-4bb5-81f7-254dc473427e\") " pod="openshift-dns/dns-default-4htrx" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.658305 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a58228cf-5189-4c26-b772-a1c2145873a0-metrics-tls\") pod \"dns-operator-744455d44c-svcq4\" (UID: \"a58228cf-5189-4c26-b772-a1c2145873a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-svcq4" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.658332 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8lfx\" (UniqueName: \"kubernetes.io/projected/3be50f08-5e27-434b-8862-52c075569d6d-kube-api-access-d8lfx\") pod \"ingress-canary-6l9sd\" (UID: \"3be50f08-5e27-434b-8862-52c075569d6d\") " pod="openshift-ingress-canary/ingress-canary-6l9sd" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.658349 4734 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eb389c3f-69cc-49bd-b413-3fbbf370ad41-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zwcnb\" (UID: \"eb389c3f-69cc-49bd-b413-3fbbf370ad41\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zwcnb" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.658383 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8251ea29-3180-4d6c-a6f7-6477bcd8ed6f-metrics-certs\") pod \"router-default-5444994796-2djpb\" (UID: \"8251ea29-3180-4d6c-a6f7-6477bcd8ed6f\") " pod="openshift-ingress/router-default-5444994796-2djpb" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.658400 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5069f4e4-270c-4fa2-9121-e6da86b389d1-config\") pod \"etcd-operator-b45778765-qmqbr\" (UID: \"5069f4e4-270c-4fa2-9121-e6da86b389d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qmqbr" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.658418 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgvw5\" (UniqueName: \"kubernetes.io/projected/eae91bc6-fbec-4bb5-81f7-254dc473427e-kube-api-access-zgvw5\") pod \"dns-default-4htrx\" (UID: \"eae91bc6-fbec-4bb5-81f7-254dc473427e\") " pod="openshift-dns/dns-default-4htrx" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.658435 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d5a2bf75-6d4d-40e9-a4d2-0aa192d25cc8-node-bootstrap-token\") pod \"machine-config-server-mghs5\" (UID: \"d5a2bf75-6d4d-40e9-a4d2-0aa192d25cc8\") " pod="openshift-machine-config-operator/machine-config-server-mghs5" Dec 05 23:21:58 crc 
kubenswrapper[4734]: I1205 23:21:58.658457 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/24632d76-79cd-400b-bfad-a4c8a0ffbb68-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9lr2g\" (UID: \"24632d76-79cd-400b-bfad-a4c8a0ffbb68\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lr2g" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.658484 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m98bg\" (UniqueName: \"kubernetes.io/projected/0cd8fc51-deec-410b-b2bb-4818c2f71230-kube-api-access-m98bg\") pod \"marketplace-operator-79b997595-ws6qt\" (UID: \"0cd8fc51-deec-410b-b2bb-4818c2f71230\") " pod="openshift-marketplace/marketplace-operator-79b997595-ws6qt" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.658500 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/de8d0e19-aece-4044-9eb8-ede1e5edda45-auth-proxy-config\") pod \"machine-config-operator-74547568cd-d9vxl\" (UID: \"de8d0e19-aece-4044-9eb8-ede1e5edda45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9vxl" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.658519 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c8fec47c-0cfe-45b1-9f45-5eba4c924359-socket-dir\") pod \"csi-hostpathplugin-w5bmh\" (UID: \"c8fec47c-0cfe-45b1-9f45-5eba4c924359\") " pod="hostpath-provisioner/csi-hostpathplugin-w5bmh" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.658564 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/05c3d993-bbb4-4f67-8952-23d9b107b889-apiservice-cert\") pod \"packageserver-d55dfcdfc-4jlxs\" (UID: 
\"05c3d993-bbb4-4f67-8952-23d9b107b889\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jlxs" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.659466 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8251ea29-3180-4d6c-a6f7-6477bcd8ed6f-service-ca-bundle\") pod \"router-default-5444994796-2djpb\" (UID: \"8251ea29-3180-4d6c-a6f7-6477bcd8ed6f\") " pod="openshift-ingress/router-default-5444994796-2djpb" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.660339 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17750e75-596f-4637-a240-55aabb725a86-config\") pod \"service-ca-operator-777779d784-ckxzz\" (UID: \"17750e75-596f-4637-a240-55aabb725a86\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ckxzz" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.663501 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a20501a9-7a2f-46ba-8322-a9b38d14bb4a-srv-cert\") pod \"catalog-operator-68c6474976-4wmxp\" (UID: \"a20501a9-7a2f-46ba-8322-a9b38d14bb4a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wmxp" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.664004 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c8fec47c-0cfe-45b1-9f45-5eba4c924359-plugins-dir\") pod \"csi-hostpathplugin-w5bmh\" (UID: \"c8fec47c-0cfe-45b1-9f45-5eba4c924359\") " pod="hostpath-provisioner/csi-hostpathplugin-w5bmh" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.664851 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a20501a9-7a2f-46ba-8322-a9b38d14bb4a-profile-collector-cert\") pod 
\"catalog-operator-68c6474976-4wmxp\" (UID: \"a20501a9-7a2f-46ba-8322-a9b38d14bb4a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wmxp" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.664894 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5069f4e4-270c-4fa2-9121-e6da86b389d1-etcd-service-ca\") pod \"etcd-operator-b45778765-qmqbr\" (UID: \"5069f4e4-270c-4fa2-9121-e6da86b389d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qmqbr" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.665280 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/7b7cfcc6-ee0e-4af5-9e03-5d8cbb40edbb-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qvhvd\" (UID: \"7b7cfcc6-ee0e-4af5-9e03-5d8cbb40edbb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qvhvd" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.665583 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5069f4e4-270c-4fa2-9121-e6da86b389d1-etcd-client\") pod \"etcd-operator-b45778765-qmqbr\" (UID: \"5069f4e4-270c-4fa2-9121-e6da86b389d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qmqbr" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.667009 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8251ea29-3180-4d6c-a6f7-6477bcd8ed6f-default-certificate\") pod \"router-default-5444994796-2djpb\" (UID: \"8251ea29-3180-4d6c-a6f7-6477bcd8ed6f\") " pod="openshift-ingress/router-default-5444994796-2djpb" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.667663 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0cd8fc51-deec-410b-b2bb-4818c2f71230-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ws6qt\" (UID: \"0cd8fc51-deec-410b-b2bb-4818c2f71230\") " pod="openshift-marketplace/marketplace-operator-79b997595-ws6qt" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.667902 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8f1fb91a-4e37-4bad-baa4-4996c7dd06e8-signing-cabundle\") pod \"service-ca-9c57cc56f-ds2cw\" (UID: \"8f1fb91a-4e37-4bad-baa4-4996c7dd06e8\") " pod="openshift-service-ca/service-ca-9c57cc56f-ds2cw" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.668212 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82e59745-0a0a-4c7c-a61b-d801aed4d11d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-qqvvd\" (UID: \"82e59745-0a0a-4c7c-a61b-d801aed4d11d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qqvvd" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.668305 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c8fec47c-0cfe-45b1-9f45-5eba4c924359-csi-data-dir\") pod \"csi-hostpathplugin-w5bmh\" (UID: \"c8fec47c-0cfe-45b1-9f45-5eba4c924359\") " pod="hostpath-provisioner/csi-hostpathplugin-w5bmh" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.672915 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb389c3f-69cc-49bd-b413-3fbbf370ad41-config\") pod \"kube-controller-manager-operator-78b949d7b-zwcnb\" (UID: \"eb389c3f-69cc-49bd-b413-3fbbf370ad41\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zwcnb" Dec 05 23:21:58 crc 
kubenswrapper[4734]: I1205 23:21:58.676047 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/36efc3c0-8de6-423d-bb0e-c76488f53955-proxy-tls\") pod \"machine-config-controller-84d6567774-dqz9r\" (UID: \"36efc3c0-8de6-423d-bb0e-c76488f53955\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dqz9r" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.677583 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8251ea29-3180-4d6c-a6f7-6477bcd8ed6f-stats-auth\") pod \"router-default-5444994796-2djpb\" (UID: \"8251ea29-3180-4d6c-a6f7-6477bcd8ed6f\") " pod="openshift-ingress/router-default-5444994796-2djpb" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.677938 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c8fec47c-0cfe-45b1-9f45-5eba4c924359-registration-dir\") pod \"csi-hostpathplugin-w5bmh\" (UID: \"c8fec47c-0cfe-45b1-9f45-5eba4c924359\") " pod="hostpath-provisioner/csi-hostpathplugin-w5bmh" Dec 05 23:21:58 crc kubenswrapper[4734]: E1205 23:21:58.678205 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:21:59.178190646 +0000 UTC m=+139.861594922 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.678800 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e08ddd6-cfa7-4e6b-902f-d789f91fd70b-trusted-ca\") pod \"ingress-operator-5b745b69d9-kkt7h\" (UID: \"9e08ddd6-cfa7-4e6b-902f-d789f91fd70b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kkt7h" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.679296 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5069f4e4-270c-4fa2-9121-e6da86b389d1-etcd-ca\") pod \"etcd-operator-b45778765-qmqbr\" (UID: \"5069f4e4-270c-4fa2-9121-e6da86b389d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qmqbr" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.679394 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5069f4e4-270c-4fa2-9121-e6da86b389d1-config\") pod \"etcd-operator-b45778765-qmqbr\" (UID: \"5069f4e4-270c-4fa2-9121-e6da86b389d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qmqbr" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.680468 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a20dbad-8352-4804-9c0e-a2b6108a0d1b-secret-volume\") pod \"collect-profiles-29416275-mfkvp\" (UID: \"2a20dbad-8352-4804-9c0e-a2b6108a0d1b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29416275-mfkvp" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.681185 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24632d76-79cd-400b-bfad-a4c8a0ffbb68-config\") pod \"kube-apiserver-operator-766d6c64bb-9lr2g\" (UID: \"24632d76-79cd-400b-bfad-a4c8a0ffbb68\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lr2g" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.683202 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.683275 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de8d0e19-aece-4044-9eb8-ede1e5edda45-proxy-tls\") pod \"machine-config-operator-74547568cd-d9vxl\" (UID: \"de8d0e19-aece-4044-9eb8-ede1e5edda45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9vxl" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.683429 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/de8d0e19-aece-4044-9eb8-ede1e5edda45-auth-proxy-config\") pod \"machine-config-operator-74547568cd-d9vxl\" (UID: \"de8d0e19-aece-4044-9eb8-ede1e5edda45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9vxl" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.683752 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c8fec47c-0cfe-45b1-9f45-5eba4c924359-socket-dir\") pod \"csi-hostpathplugin-w5bmh\" (UID: \"c8fec47c-0cfe-45b1-9f45-5eba4c924359\") " pod="hostpath-provisioner/csi-hostpathplugin-w5bmh" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.683792 4734 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/64b05eab-74bd-43bd-b206-54f4e784e581-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-m6c2b\" (UID: \"64b05eab-74bd-43bd-b206-54f4e784e581\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-m6c2b" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.683988 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eae91bc6-fbec-4bb5-81f7-254dc473427e-config-volume\") pod \"dns-default-4htrx\" (UID: \"eae91bc6-fbec-4bb5-81f7-254dc473427e\") " pod="openshift-dns/dns-default-4htrx" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.685050 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/36efc3c0-8de6-423d-bb0e-c76488f53955-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dqz9r\" (UID: \"36efc3c0-8de6-423d-bb0e-c76488f53955\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dqz9r" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.688825 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24632d76-79cd-400b-bfad-a4c8a0ffbb68-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9lr2g\" (UID: \"24632d76-79cd-400b-bfad-a4c8a0ffbb68\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lr2g" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.701871 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/05c3d993-bbb4-4f67-8952-23d9b107b889-webhook-cert\") pod \"packageserver-d55dfcdfc-4jlxs\" (UID: \"05c3d993-bbb4-4f67-8952-23d9b107b889\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jlxs" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 
23:21:58.702406 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8251ea29-3180-4d6c-a6f7-6477bcd8ed6f-metrics-certs\") pod \"router-default-5444994796-2djpb\" (UID: \"8251ea29-3180-4d6c-a6f7-6477bcd8ed6f\") " pod="openshift-ingress/router-default-5444994796-2djpb" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.702917 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d5a2bf75-6d4d-40e9-a4d2-0aa192d25cc8-certs\") pod \"machine-config-server-mghs5\" (UID: \"d5a2bf75-6d4d-40e9-a4d2-0aa192d25cc8\") " pod="openshift-machine-config-operator/machine-config-server-mghs5" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.703286 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0cd8fc51-deec-410b-b2bb-4818c2f71230-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ws6qt\" (UID: \"0cd8fc51-deec-410b-b2bb-4818c2f71230\") " pod="openshift-marketplace/marketplace-operator-79b997595-ws6qt" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.703429 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17750e75-596f-4637-a240-55aabb725a86-serving-cert\") pod \"service-ca-operator-777779d784-ckxzz\" (UID: \"17750e75-596f-4637-a240-55aabb725a86\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ckxzz" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.704046 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb389c3f-69cc-49bd-b413-3fbbf370ad41-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zwcnb\" (UID: \"eb389c3f-69cc-49bd-b413-3fbbf370ad41\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zwcnb" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.704117 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0de1aed3-e393-4d4f-b201-12142736c664-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mrqvs\" (UID: \"0de1aed3-e393-4d4f-b201-12142736c664\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrqvs" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.704785 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5069f4e4-270c-4fa2-9121-e6da86b389d1-serving-cert\") pod \"etcd-operator-b45778765-qmqbr\" (UID: \"5069f4e4-270c-4fa2-9121-e6da86b389d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qmqbr" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.705284 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8f1fb91a-4e37-4bad-baa4-4996c7dd06e8-signing-key\") pod \"service-ca-9c57cc56f-ds2cw\" (UID: \"8f1fb91a-4e37-4bad-baa4-4996c7dd06e8\") " pod="openshift-service-ca/service-ca-9c57cc56f-ds2cw" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.705368 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3be50f08-5e27-434b-8862-52c075569d6d-cert\") pod \"ingress-canary-6l9sd\" (UID: \"3be50f08-5e27-434b-8862-52c075569d6d\") " pod="openshift-ingress-canary/ingress-canary-6l9sd" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.705629 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9e08ddd6-cfa7-4e6b-902f-d789f91fd70b-metrics-tls\") pod \"ingress-operator-5b745b69d9-kkt7h\" (UID: \"9e08ddd6-cfa7-4e6b-902f-d789f91fd70b\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kkt7h" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.705675 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2e6d7ac5-5f33-4561-a79b-685f9ae74144-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-tsb97\" (UID: \"2e6d7ac5-5f33-4561-a79b-685f9ae74144\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tsb97" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.705828 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/05c3d993-bbb4-4f67-8952-23d9b107b889-apiservice-cert\") pod \"packageserver-d55dfcdfc-4jlxs\" (UID: \"05c3d993-bbb4-4f67-8952-23d9b107b889\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jlxs" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.706170 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0de1aed3-e393-4d4f-b201-12142736c664-srv-cert\") pod \"olm-operator-6b444d44fb-mrqvs\" (UID: \"0de1aed3-e393-4d4f-b201-12142736c664\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrqvs" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.707082 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eae91bc6-fbec-4bb5-81f7-254dc473427e-metrics-tls\") pod \"dns-default-4htrx\" (UID: \"eae91bc6-fbec-4bb5-81f7-254dc473427e\") " pod="openshift-dns/dns-default-4htrx" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.707490 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d5a2bf75-6d4d-40e9-a4d2-0aa192d25cc8-node-bootstrap-token\") pod \"machine-config-server-mghs5\" (UID: 
\"d5a2bf75-6d4d-40e9-a4d2-0aa192d25cc8\") " pod="openshift-machine-config-operator/machine-config-server-mghs5" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.709859 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.734133 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.735370 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a58228cf-5189-4c26-b772-a1c2145873a0-metrics-tls\") pod \"dns-operator-744455d44c-svcq4\" (UID: \"a58228cf-5189-4c26-b772-a1c2145873a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-svcq4" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.754772 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgtxb\" (UniqueName: \"kubernetes.io/projected/f4f948a0-bcd5-4e9e-86ec-0429082dac44-kube-api-access-mgtxb\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.759356 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:21:58 crc kubenswrapper[4734]: E1205 23:21:58.760055 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-05 23:21:59.260030977 +0000 UTC m=+139.943435243 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.777369 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82e59745-0a0a-4c7c-a61b-d801aed4d11d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-qqvvd\" (UID: \"82e59745-0a0a-4c7c-a61b-d801aed4d11d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qqvvd" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.779622 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/de8d0e19-aece-4044-9eb8-ede1e5edda45-images\") pod \"machine-config-operator-74547568cd-d9vxl\" (UID: \"de8d0e19-aece-4044-9eb8-ede1e5edda45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9vxl" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.779817 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a20dbad-8352-4804-9c0e-a2b6108a0d1b-config-volume\") pod \"collect-profiles-29416275-mfkvp\" (UID: \"2a20dbad-8352-4804-9c0e-a2b6108a0d1b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416275-mfkvp" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.793101 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/f4f948a0-bcd5-4e9e-86ec-0429082dac44-bound-sa-token\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.794355 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt4xx\" (UniqueName: \"kubernetes.io/projected/0663418a-98b9-48b4-869e-257a7bddd32e-kube-api-access-bt4xx\") pod \"openshift-controller-manager-operator-756b6f6bc6-k9fv6\" (UID: \"0663418a-98b9-48b4-869e-257a7bddd32e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k9fv6" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.818856 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gkww2"] Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.836643 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxsbj\" (UniqueName: \"kubernetes.io/projected/a58228cf-5189-4c26-b772-a1c2145873a0-kube-api-access-hxsbj\") pod \"dns-operator-744455d44c-svcq4\" (UID: \"a58228cf-5189-4c26-b772-a1c2145873a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-svcq4" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.856497 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl2rb\" (UniqueName: \"kubernetes.io/projected/5069f4e4-270c-4fa2-9121-e6da86b389d1-kube-api-access-sl2rb\") pod \"etcd-operator-b45778765-qmqbr\" (UID: \"5069f4e4-270c-4fa2-9121-e6da86b389d1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qmqbr" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.860879 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k9fv6" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.861958 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:21:58 crc kubenswrapper[4734]: E1205 23:21:58.862839 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:21:59.362808682 +0000 UTC m=+140.046212958 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.877369 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scbt4\" (UniqueName: \"kubernetes.io/projected/82e59745-0a0a-4c7c-a61b-d801aed4d11d-kube-api-access-scbt4\") pod \"kube-storage-version-migrator-operator-b67b599dd-qqvvd\" (UID: \"82e59745-0a0a-4c7c-a61b-d801aed4d11d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qqvvd" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.907766 4734 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-clcxc\" (UniqueName: \"kubernetes.io/projected/64b05eab-74bd-43bd-b206-54f4e784e581-kube-api-access-clcxc\") pod \"multus-admission-controller-857f4d67dd-m6c2b\" (UID: \"64b05eab-74bd-43bd-b206-54f4e784e581\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-m6c2b" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.922834 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-svcq4" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.929503 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-m6c2b" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.936593 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dnsc\" (UniqueName: \"kubernetes.io/projected/eb3de870-5133-4874-bc83-37be5f299296-kube-api-access-8dnsc\") pod \"migrator-59844c95c7-n9fhz\" (UID: \"eb3de870-5133-4874-bc83-37be5f299296\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-n9fhz" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.937481 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd8ws\" (UniqueName: \"kubernetes.io/projected/2a20dbad-8352-4804-9c0e-a2b6108a0d1b-kube-api-access-gd8ws\") pod \"collect-profiles-29416275-mfkvp\" (UID: \"2a20dbad-8352-4804-9c0e-a2b6108a0d1b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416275-mfkvp" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.938550 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-qmqbr" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.958119 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qqvvd" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.964780 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:21:58 crc kubenswrapper[4734]: E1205 23:21:58.965387 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:21:59.465367592 +0000 UTC m=+140.148771868 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.977932 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9e08ddd6-cfa7-4e6b-902f-d789f91fd70b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kkt7h\" (UID: \"9e08ddd6-cfa7-4e6b-902f-d789f91fd70b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kkt7h" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.978595 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbrws\" (UniqueName: 
\"kubernetes.io/projected/8251ea29-3180-4d6c-a6f7-6477bcd8ed6f-kube-api-access-rbrws\") pod \"router-default-5444994796-2djpb\" (UID: \"8251ea29-3180-4d6c-a6f7-6477bcd8ed6f\") " pod="openshift-ingress/router-default-5444994796-2djpb" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.994188 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-n9fhz" Dec 05 23:21:58 crc kubenswrapper[4734]: I1205 23:21:58.995391 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6np8v\" (UniqueName: \"kubernetes.io/projected/9e08ddd6-cfa7-4e6b-902f-d789f91fd70b-kube-api-access-6np8v\") pod \"ingress-operator-5b745b69d9-kkt7h\" (UID: \"9e08ddd6-cfa7-4e6b-902f-d789f91fd70b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kkt7h" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.031742 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szzhz\" (UniqueName: \"kubernetes.io/projected/0de1aed3-e393-4d4f-b201-12142736c664-kube-api-access-szzhz\") pod \"olm-operator-6b444d44fb-mrqvs\" (UID: \"0de1aed3-e393-4d4f-b201-12142736c664\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrqvs" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.051499 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416275-mfkvp" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.068953 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:21:59 crc kubenswrapper[4734]: E1205 23:21:59.069402 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:21:59.569386657 +0000 UTC m=+140.252790933 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.072389 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdmfb\" (UniqueName: \"kubernetes.io/projected/7b7cfcc6-ee0e-4af5-9e03-5d8cbb40edbb-kube-api-access-qdmfb\") pod \"control-plane-machine-set-operator-78cbb6b69f-qvhvd\" (UID: \"7b7cfcc6-ee0e-4af5-9e03-5d8cbb40edbb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qvhvd" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.075551 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7848p\" (UniqueName: 
\"kubernetes.io/projected/17750e75-596f-4637-a240-55aabb725a86-kube-api-access-7848p\") pod \"service-ca-operator-777779d784-ckxzz\" (UID: \"17750e75-596f-4637-a240-55aabb725a86\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ckxzz" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.082803 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7r2n\" (UniqueName: \"kubernetes.io/projected/8f1fb91a-4e37-4bad-baa4-4996c7dd06e8-kube-api-access-l7r2n\") pod \"service-ca-9c57cc56f-ds2cw\" (UID: \"8f1fb91a-4e37-4bad-baa4-4996c7dd06e8\") " pod="openshift-service-ca/service-ca-9c57cc56f-ds2cw" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.102343 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc5ql\" (UniqueName: \"kubernetes.io/projected/a20501a9-7a2f-46ba-8322-a9b38d14bb4a-kube-api-access-pc5ql\") pod \"catalog-operator-68c6474976-4wmxp\" (UID: \"a20501a9-7a2f-46ba-8322-a9b38d14bb4a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wmxp" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.120272 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fngsb\" (UniqueName: \"kubernetes.io/projected/2e6d7ac5-5f33-4561-a79b-685f9ae74144-kube-api-access-fngsb\") pod \"package-server-manager-789f6589d5-tsb97\" (UID: \"2e6d7ac5-5f33-4561-a79b-685f9ae74144\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tsb97" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.141482 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92mnn\" (UniqueName: \"kubernetes.io/projected/de8d0e19-aece-4044-9eb8-ede1e5edda45-kube-api-access-92mnn\") pod \"machine-config-operator-74547568cd-d9vxl\" (UID: \"de8d0e19-aece-4044-9eb8-ede1e5edda45\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9vxl" Dec 05 23:21:59 crc 
kubenswrapper[4734]: I1205 23:21:59.161860 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgvw5\" (UniqueName: \"kubernetes.io/projected/eae91bc6-fbec-4bb5-81f7-254dc473427e-kube-api-access-zgvw5\") pod \"dns-default-4htrx\" (UID: \"eae91bc6-fbec-4bb5-81f7-254dc473427e\") " pod="openshift-dns/dns-default-4htrx" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.170513 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:21:59 crc kubenswrapper[4734]: E1205 23:21:59.171119 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:21:59.671096237 +0000 UTC m=+140.354500513 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.199414 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eb389c3f-69cc-49bd-b413-3fbbf370ad41-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zwcnb\" (UID: \"eb389c3f-69cc-49bd-b413-3fbbf370ad41\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zwcnb" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.202563 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8lfx\" (UniqueName: \"kubernetes.io/projected/3be50f08-5e27-434b-8862-52c075569d6d-kube-api-access-d8lfx\") pod \"ingress-canary-6l9sd\" (UID: \"3be50f08-5e27-434b-8862-52c075569d6d\") " pod="openshift-ingress-canary/ingress-canary-6l9sd" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.212953 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zwcnb" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.220787 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbjd2\" (UniqueName: \"kubernetes.io/projected/d5a2bf75-6d4d-40e9-a4d2-0aa192d25cc8-kube-api-access-cbjd2\") pod \"machine-config-server-mghs5\" (UID: \"d5a2bf75-6d4d-40e9-a4d2-0aa192d25cc8\") " pod="openshift-machine-config-operator/machine-config-server-mghs5" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.233293 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-htwjw"] Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.247245 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kkt7h" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.253010 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m98bg\" (UniqueName: \"kubernetes.io/projected/0cd8fc51-deec-410b-b2bb-4818c2f71230-kube-api-access-m98bg\") pod \"marketplace-operator-79b997595-ws6qt\" (UID: \"0cd8fc51-deec-410b-b2bb-4818c2f71230\") " pod="openshift-marketplace/marketplace-operator-79b997595-ws6qt" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.268279 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-2djpb" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.278390 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:21:59 crc kubenswrapper[4734]: E1205 23:21:59.278796 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:21:59.778783313 +0000 UTC m=+140.462187589 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.279116 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/24632d76-79cd-400b-bfad-a4c8a0ffbb68-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9lr2g\" (UID: \"24632d76-79cd-400b-bfad-a4c8a0ffbb68\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lr2g" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.279569 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wmxp" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.282901 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lr2g" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.285928 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9vxl" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.302232 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2c4t\" (UniqueName: \"kubernetes.io/projected/05c3d993-bbb4-4f67-8952-23d9b107b889-kube-api-access-n2c4t\") pod \"packageserver-d55dfcdfc-4jlxs\" (UID: \"05c3d993-bbb4-4f67-8952-23d9b107b889\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jlxs" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.307213 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ws6qt" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.314564 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppnph\" (UniqueName: \"kubernetes.io/projected/c8fec47c-0cfe-45b1-9f45-5eba4c924359-kube-api-access-ppnph\") pod \"csi-hostpathplugin-w5bmh\" (UID: \"c8fec47c-0cfe-45b1-9f45-5eba4c924359\") " pod="hostpath-provisioner/csi-hostpathplugin-w5bmh" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.323209 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrqvs" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.324465 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwnhv\" (UniqueName: \"kubernetes.io/projected/36efc3c0-8de6-423d-bb0e-c76488f53955-kube-api-access-qwnhv\") pod \"machine-config-controller-84d6567774-dqz9r\" (UID: \"36efc3c0-8de6-423d-bb0e-c76488f53955\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dqz9r" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.329056 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qvhvd" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.338774 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ckxzz" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.345471 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-ds2cw" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.360478 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tsb97" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.377443 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-w5bmh" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.380183 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:21:59 crc kubenswrapper[4734]: E1205 23:21:59.380679 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:21:59.880660886 +0000 UTC m=+140.564065152 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.402905 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-mghs5" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.403360 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4htrx" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.405413 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6l9sd" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.443908 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gxdpj"] Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.444002 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k9rd"] Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.486814 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.487182 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-2djpb" event={"ID":"8251ea29-3180-4d6c-a6f7-6477bcd8ed6f","Type":"ContainerStarted","Data":"f75b7651de255ff9ceba55c5009c2fa09a7061290e2b35fa6303c9d92d7c1be2"} Dec 05 23:21:59 crc kubenswrapper[4734]: E1205 23:21:59.487236 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:21:59.987218294 +0000 UTC m=+140.670622570 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.500541 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-htwjw" event={"ID":"d391f1fa-9bbe-478c-a1da-6ccb8f75f3c5","Type":"ContainerStarted","Data":"ef784313976bb5c58b37e297d3d45980c664207a19bbb21c7c8a02e35ceab8b1"} Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.532926 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-gkww2" event={"ID":"30f5e37c-8b52-4347-bc5e-a973ca06a7bf","Type":"ContainerStarted","Data":"bbbd9e113370ca3788eef996f224a3f6769fa7281d0add71a3b6aaf146d53b61"} Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.532981 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-gkww2" event={"ID":"30f5e37c-8b52-4347-bc5e-a973ca06a7bf","Type":"ContainerStarted","Data":"c1badcb197e8eba5df318080a451bfc587bb0465fa395f190a6d74485377eaaa"} Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.533795 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-gkww2" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.533839 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wl2fs"] Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.537181 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-config-operator/openshift-config-operator-7777fb866f-txdvl"] Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.537249 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-5h9wr"] Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.563052 4734 patch_prober.go:28] interesting pod/console-operator-58897d9998-gkww2 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.563378 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-gkww2" podUID="30f5e37c-8b52-4347-bc5e-a973ca06a7bf" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.583932 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsgsc" event={"ID":"4520b844-1a95-4300-8a10-5ef68e2067cb","Type":"ContainerStarted","Data":"96e83d35e148324dd6c67756858b42af220ce184ca88b4a78a06f311b3a839df"} Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.584001 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsgsc" event={"ID":"4520b844-1a95-4300-8a10-5ef68e2067cb","Type":"ContainerStarted","Data":"02a08083e840f0d1dc271f3fd4b9f122bb26185f3cdc106fb5628718b18566df"} Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.584018 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsgsc" 
event={"ID":"4520b844-1a95-4300-8a10-5ef68e2067cb","Type":"ContainerStarted","Data":"07d6236b64c42dd01837a0b63c84ca01d629567aa11bfb493658c1b1f21e1cbb"} Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.587734 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:21:59 crc kubenswrapper[4734]: E1205 23:21:59.590060 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:00.09003987 +0000 UTC m=+140.773444146 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.603811 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jlxs" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.615986 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dqz9r" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.617583 4734 generic.go:334] "Generic (PLEG): container finished" podID="07e87699-6af3-4f68-a3c8-85780433774b" containerID="dc3b9b3c3fda281b4ed9a1d8b194a9c687c0cb0325dfbc6885a6eea31b862ad0" exitCode=0 Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.696983 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:21:59 crc kubenswrapper[4734]: E1205 23:21:59.699501 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:00.199474679 +0000 UTC m=+140.882878955 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.811418 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:21:59 crc kubenswrapper[4734]: E1205 23:21:59.812782 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:00.312753022 +0000 UTC m=+140.996157298 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.818975 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-fsrfk" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.819068 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-x67qn"] Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.819144 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x5rwz" Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.819203 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2dx28"] Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.819282 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-n9fhz"] Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.819342 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz" event={"ID":"07e87699-6af3-4f68-a3c8-85780433774b","Type":"ContainerDied","Data":"dc3b9b3c3fda281b4ed9a1d8b194a9c687c0cb0325dfbc6885a6eea31b862ad0"} Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.819411 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-776fp" 
event={"ID":"86bfea27-2a17-463f-9768-201c49599d74","Type":"ContainerStarted","Data":"098ceead91702beb8e995d77184723d9690688c3789731921689002e0809381f"} Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.819473 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44nn9" event={"ID":"448e552f-8a25-469f-b959-4fbe91ae9035","Type":"ContainerStarted","Data":"df0e2ba659c3ea50efa136ed24cf9dc1198e919b5816faa033edc686584568d3"} Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.819570 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44nn9" event={"ID":"448e552f-8a25-469f-b959-4fbe91ae9035","Type":"ContainerStarted","Data":"529cbb5d3a7af5792eae4113bd6e901da67805ba6ca43a1ddf6295a3f6bec735"} Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.819636 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-j6jsf" event={"ID":"74a8397f-0607-4761-9fc5-77e9a6d197c8","Type":"ContainerStarted","Data":"4086fe2c3bcdbdc0310f1aece07a271a9d1827404723f31bbfbc4405bbd80c4f"} Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.825504 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-svcq4"] Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.828845 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k9fv6"] Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.840475 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-m6c2b"] Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.882384 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qmqbr"] Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 
23:21:59.884638 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qqvvd"] Dec 05 23:21:59 crc kubenswrapper[4734]: I1205 23:21:59.917998 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:21:59 crc kubenswrapper[4734]: E1205 23:21:59.918314 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:00.418290466 +0000 UTC m=+141.101694742 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:00 crc kubenswrapper[4734]: I1205 23:22:00.020992 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:00 crc kubenswrapper[4734]: E1205 23:22:00.021780 4734 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:00.521561634 +0000 UTC m=+141.204965910 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:00 crc kubenswrapper[4734]: I1205 23:22:00.022910 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:00 crc kubenswrapper[4734]: E1205 23:22:00.023289 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:00.523268935 +0000 UTC m=+141.206673211 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:00 crc kubenswrapper[4734]: I1205 23:22:00.042895 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416275-mfkvp"] Dec 05 23:22:00 crc kubenswrapper[4734]: I1205 23:22:00.125471 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:00 crc kubenswrapper[4734]: E1205 23:22:00.126187 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:00.626165773 +0000 UTC m=+141.309570049 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:00 crc kubenswrapper[4734]: I1205 23:22:00.231388 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:00 crc kubenswrapper[4734]: E1205 23:22:00.231820 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:00.73180807 +0000 UTC m=+141.415212346 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:00 crc kubenswrapper[4734]: I1205 23:22:00.232648 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-gkww2" podStartSLOduration=120.232626799 podStartE2EDuration="2m0.232626799s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:00.232377713 +0000 UTC m=+140.915781989" watchObservedRunningTime="2025-12-05 23:22:00.232626799 +0000 UTC m=+140.916031075" Dec 05 23:22:00 crc kubenswrapper[4734]: I1205 23:22:00.321806 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-44nn9" podStartSLOduration=120.32178086 podStartE2EDuration="2m0.32178086s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:00.284763 +0000 UTC m=+140.968167276" watchObservedRunningTime="2025-12-05 23:22:00.32178086 +0000 UTC m=+141.005185136" Dec 05 23:22:00 crc kubenswrapper[4734]: I1205 23:22:00.332018 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:00 crc kubenswrapper[4734]: E1205 23:22:00.332713 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:00.832694328 +0000 UTC m=+141.516098604 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:00 crc kubenswrapper[4734]: I1205 23:22:00.434441 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:00 crc kubenswrapper[4734]: E1205 23:22:00.434847 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:00.934833698 +0000 UTC m=+141.618237974 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:00 crc kubenswrapper[4734]: W1205 23:22:00.466445 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5069f4e4_270c_4fa2_9121_e6da86b389d1.slice/crio-779d6f405193d3636ac92738bf4ebf3d3cd0a311da9115aed74d8fa89b1abaa6 WatchSource:0}: Error finding container 779d6f405193d3636ac92738bf4ebf3d3cd0a311da9115aed74d8fa89b1abaa6: Status 404 returned error can't find the container with id 779d6f405193d3636ac92738bf4ebf3d3cd0a311da9115aed74d8fa89b1abaa6 Dec 05 23:22:00 crc kubenswrapper[4734]: I1205 23:22:00.535569 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:00 crc kubenswrapper[4734]: E1205 23:22:00.536058 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:01.036039664 +0000 UTC m=+141.719443940 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:00 crc kubenswrapper[4734]: I1205 23:22:00.599510 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-j6jsf" podStartSLOduration=120.599459322 podStartE2EDuration="2m0.599459322s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:00.598693934 +0000 UTC m=+141.282098200" watchObservedRunningTime="2025-12-05 23:22:00.599459322 +0000 UTC m=+141.282863598" Dec 05 23:22:00 crc kubenswrapper[4734]: I1205 23:22:00.631259 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsgsc" podStartSLOduration=120.631237273 podStartE2EDuration="2m0.631237273s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:00.629489111 +0000 UTC m=+141.312893387" watchObservedRunningTime="2025-12-05 23:22:00.631237273 +0000 UTC m=+141.314641549" Dec 05 23:22:00 crc kubenswrapper[4734]: I1205 23:22:00.638122 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: 
\"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:00 crc kubenswrapper[4734]: E1205 23:22:00.638524 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:01.138507382 +0000 UTC m=+141.821911658 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:00 crc kubenswrapper[4734]: E1205 23:22:00.685482 4734 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod790e28b3_bfd6_40f2_8bd4_272fc91b9ffe.slice/crio-conmon-a4a73def955379da1e0b0bb1e059e2e98a293893f07530194f7dd08dad90854a.scope\": RecentStats: unable to find data in memory cache]" Dec 05 23:22:00 crc kubenswrapper[4734]: I1205 23:22:00.739292 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:00 crc kubenswrapper[4734]: E1205 23:22:00.739737 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:01.239716868 +0000 UTC m=+141.923121144 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:00 crc kubenswrapper[4734]: I1205 23:22:00.741217 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-776fp" podStartSLOduration=120.741191364 podStartE2EDuration="2m0.741191364s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:00.729099078 +0000 UTC m=+141.412503354" watchObservedRunningTime="2025-12-05 23:22:00.741191364 +0000 UTC m=+141.424595630" Dec 05 23:22:00 crc kubenswrapper[4734]: I1205 23:22:00.851168 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:00 crc kubenswrapper[4734]: E1205 23:22:00.852162 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-05 23:22:01.352147371 +0000 UTC m=+142.035551647 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:00 crc kubenswrapper[4734]: I1205 23:22:00.913057 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-2djpb" event={"ID":"8251ea29-3180-4d6c-a6f7-6477bcd8ed6f","Type":"ContainerStarted","Data":"6021e929c0c38bb8b5bb86eb9b59a3a98b6884da26d6a5481cebddd6dcc0c8c7"} Dec 05 23:22:00 crc kubenswrapper[4734]: I1205 23:22:00.956620 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:00 crc kubenswrapper[4734]: E1205 23:22:00.957367 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:01.457342916 +0000 UTC m=+142.140747192 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:00 crc kubenswrapper[4734]: I1205 23:22:00.975930 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qqvvd" event={"ID":"82e59745-0a0a-4c7c-a61b-d801aed4d11d","Type":"ContainerStarted","Data":"7a27722dec66695164f699ea42f8a78ec676611579a3a5b8635a507d3a5c6302"} Dec 05 23:22:00 crc kubenswrapper[4734]: I1205 23:22:00.977442 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-qmqbr" event={"ID":"5069f4e4-270c-4fa2-9121-e6da86b389d1","Type":"ContainerStarted","Data":"779d6f405193d3636ac92738bf4ebf3d3cd0a311da9115aed74d8fa89b1abaa6"} Dec 05 23:22:00 crc kubenswrapper[4734]: I1205 23:22:00.978391 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-svcq4" event={"ID":"a58228cf-5189-4c26-b772-a1c2145873a0","Type":"ContainerStarted","Data":"a06f6feab4913f647d1ff0e989147a999335b932d4d0878f5dd4b45a0457c004"} Dec 05 23:22:00 crc kubenswrapper[4734]: I1205 23:22:00.987673 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-x67qn" event={"ID":"7adf273b-63fb-40fe-9d0d-fe467260565b","Type":"ContainerStarted","Data":"1304099481b05fe47bfd725cdf079ca6813687c4577d12911ae48934558821f2"} Dec 05 23:22:00 crc kubenswrapper[4734]: I1205 23:22:00.989652 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zwcnb"] Dec 05 23:22:00 crc kubenswrapper[4734]: I1205 23:22:00.998634 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k9rd" event={"ID":"547872c0-29db-43b6-a531-14610127080d","Type":"ContainerStarted","Data":"a12a6dd5edd1ad5f1c12c8ec10ee6300f9cc0009287da4f4f25c3e82166a1969"} Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.019670 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k9fv6" event={"ID":"0663418a-98b9-48b4-869e-257a7bddd32e","Type":"ContainerStarted","Data":"65742e381d84af215992d86ded76650373f1781acac672366a8e26ebd5009d4d"} Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.032919 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416275-mfkvp" event={"ID":"2a20dbad-8352-4804-9c0e-a2b6108a0d1b","Type":"ContainerStarted","Data":"14f868430dc14e33bceff0e87a41131a7790986ff959ff99aef89a3e5c3f1e73"} Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.036020 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-fsrfk" podStartSLOduration=121.035988448 podStartE2EDuration="2m1.035988448s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:01.000845625 +0000 UTC m=+141.684249911" watchObservedRunningTime="2025-12-05 23:22:01.035988448 +0000 UTC m=+141.719392724" Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.048247 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" 
event={"ID":"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87","Type":"ContainerStarted","Data":"d39df6290a6f17ad114964be904d291712beddea87d7322887a8f796551773fd"} Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.059580 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.061163 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-mghs5" event={"ID":"d5a2bf75-6d4d-40e9-a4d2-0aa192d25cc8","Type":"ContainerStarted","Data":"b462cba1e2a9136ba99a14c05def3a3a582d72fcaf5f283c1ce3195200ae809f"} Dec 05 23:22:01 crc kubenswrapper[4734]: E1205 23:22:01.061496 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:01.561475974 +0000 UTC m=+142.244880330 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.085427 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-n9fhz" event={"ID":"eb3de870-5133-4874-bc83-37be5f299296","Type":"ContainerStarted","Data":"5c7752cc87355d830f1d41ab4881081159135fbab8bfc091f08aabc3dc350282"} Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.171384 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:01 crc kubenswrapper[4734]: E1205 23:22:01.172179 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:01.672148624 +0000 UTC m=+142.355552900 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.172595 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:01 crc kubenswrapper[4734]: E1205 23:22:01.172950 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:01.672943384 +0000 UTC m=+142.356347660 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.175281 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-txdvl" event={"ID":"776e53fa-bf9e-44c4-8f89-2f78059733a7","Type":"ContainerStarted","Data":"d90c1b709a0f14caad07ebfd6a1e5c505dc87592bcf3b384533821980184a7f0"} Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.210837 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-m6c2b" event={"ID":"64b05eab-74bd-43bd-b206-54f4e784e581","Type":"ContainerStarted","Data":"a4f68dbb7cd8707c503fcac167ebbc99d129ec1eb41d5c94069db110313be6e7"} Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.233904 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wl2fs" event={"ID":"99827392-eef9-4b43-ab05-d57f8bc8d3ef","Type":"ContainerStarted","Data":"ae10e9864cd4e973767bff5922715f2dee67df41fbe28cf512761faab7bd8fa7"} Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.266906 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2dx28" event={"ID":"a0152808-f0b7-4ce4-9bc1-6bc11e69bd7f","Type":"ContainerStarted","Data":"4408f3aa21e47ca5c5444bdd94100243d4cee2ba3bf9dc0f89391087baa735d5"} Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.279890 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.280809 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-2djpb" Dec 05 23:22:01 crc kubenswrapper[4734]: E1205 23:22:01.280916 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:01.780899006 +0000 UTC m=+142.464303282 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.304658 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-5h9wr" event={"ID":"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe","Type":"ContainerStarted","Data":"396ccce6799b68b38f3167cbd02e540c50f7f54cca577b1a2f46a61791f58ef7"} Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.334399 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-d9vxl"] Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.359514 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ws6qt"] Dec 05 
23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.360783 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-gkww2" Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.394272 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:01 crc kubenswrapper[4734]: E1205 23:22:01.400102 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:01.900079534 +0000 UTC m=+142.583483810 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.460134 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kkt7h"] Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.495104 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:01 crc kubenswrapper[4734]: E1205 23:22:01.496435 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:01.996414672 +0000 UTC m=+142.679818948 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.504151 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wmxp"] Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.506846 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x5rwz" podStartSLOduration=121.506834097 podStartE2EDuration="2m1.506834097s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:01.504695895 +0000 UTC m=+142.188100161" watchObservedRunningTime="2025-12-05 23:22:01.506834097 +0000 UTC m=+142.190238373" Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.524469 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lr2g"] Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.559758 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.598510 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: 
\"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:01 crc kubenswrapper[4734]: E1205 23:22:01.599879 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:02.099864793 +0000 UTC m=+142.783269069 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.632592 4734 patch_prober.go:28] interesting pod/router-default-5444994796-2djpb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 23:22:01 crc kubenswrapper[4734]: [-]has-synced failed: reason withheld Dec 05 23:22:01 crc kubenswrapper[4734]: [+]process-running ok Dec 05 23:22:01 crc kubenswrapper[4734]: healthz check failed Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.632706 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2djpb" podUID="8251ea29-3180-4d6c-a6f7-6477bcd8ed6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.634224 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-2djpb" podStartSLOduration=121.634204487 
podStartE2EDuration="2m1.634204487s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:01.633652994 +0000 UTC m=+142.317057270" watchObservedRunningTime="2025-12-05 23:22:01.634204487 +0000 UTC m=+142.317608763" Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.709490 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:01 crc kubenswrapper[4734]: E1205 23:22:01.756286 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:02.256254995 +0000 UTC m=+142.939659261 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.818432 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-5h9wr" podStartSLOduration=121.818397253 podStartE2EDuration="2m1.818397253s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:01.78940409 +0000 UTC m=+142.472808366" watchObservedRunningTime="2025-12-05 23:22:01.818397253 +0000 UTC m=+142.501801529" Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.866245 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:01 crc kubenswrapper[4734]: E1205 23:22:01.866863 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:02.366840363 +0000 UTC m=+143.050244639 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.875151 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ckxzz"] Dec 05 23:22:01 crc kubenswrapper[4734]: W1205 23:22:01.896420 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda20501a9_7a2f_46ba_8322_a9b38d14bb4a.slice/crio-f09de4e4e33c59e0613998beedc0c7ce763a1f258f77c4ba41d45e7eb6c18dc6 WatchSource:0}: Error finding container f09de4e4e33c59e0613998beedc0c7ce763a1f258f77c4ba41d45e7eb6c18dc6: Status 404 returned error can't find the container with id f09de4e4e33c59e0613998beedc0c7ce763a1f258f77c4ba41d45e7eb6c18dc6 Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.969060 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:01 crc kubenswrapper[4734]: E1205 23:22:01.969320 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:02.46927024 +0000 UTC m=+143.152674526 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:01 crc kubenswrapper[4734]: I1205 23:22:01.969749 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:01 crc kubenswrapper[4734]: E1205 23:22:01.970257 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:02.470249304 +0000 UTC m=+143.153653570 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.044660 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dqz9r"] Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.070901 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:02 crc kubenswrapper[4734]: E1205 23:22:02.071423 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:02.57140634 +0000 UTC m=+143.254810606 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.073115 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tsb97"] Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.082394 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrqvs"] Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.082458 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-w5bmh"] Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.096216 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jlxs"] Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.096506 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qvhvd"] Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.119713 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6l9sd"] Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.140416 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4htrx"] Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.177348 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:02 crc kubenswrapper[4734]: E1205 23:22:02.177801 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:02.677788454 +0000 UTC m=+143.361192730 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.225988 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ds2cw"] Dec 05 23:22:02 crc kubenswrapper[4734]: W1205 23:22:02.275196 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8fec47c_0cfe_45b1_9f45_5eba4c924359.slice/crio-25ddd09f94ba2fd4c7ec805329c2fdfc5d5715e68cf95901e0adea0ba59651bb WatchSource:0}: Error finding container 25ddd09f94ba2fd4c7ec805329c2fdfc5d5715e68cf95901e0adea0ba59651bb: Status 404 returned error can't find the container with id 25ddd09f94ba2fd4c7ec805329c2fdfc5d5715e68cf95901e0adea0ba59651bb Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.282658 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:02 crc kubenswrapper[4734]: E1205 23:22:02.283455 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:02.783435879 +0000 UTC m=+143.466840155 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.284016 4734 patch_prober.go:28] interesting pod/router-default-5444994796-2djpb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 23:22:02 crc kubenswrapper[4734]: [-]has-synced failed: reason withheld Dec 05 23:22:02 crc kubenswrapper[4734]: [+]process-running ok Dec 05 23:22:02 crc kubenswrapper[4734]: healthz check failed Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.284086 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2djpb" podUID="8251ea29-3180-4d6c-a6f7-6477bcd8ed6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.337295 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd-operator/etcd-operator-b45778765-qmqbr" event={"ID":"5069f4e4-270c-4fa2-9121-e6da86b389d1","Type":"ContainerStarted","Data":"4a0abc8cc1a3e8943f7e3eb23af3e0e174cc8ecd0e727b290ac545e5d81ceb83"} Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.375787 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zwcnb" event={"ID":"eb389c3f-69cc-49bd-b413-3fbbf370ad41","Type":"ContainerStarted","Data":"24f8c6c4267c13e699ebcf745775541d4f907a9d30f08875caa90dfb672c4a23"} Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.375857 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zwcnb" event={"ID":"eb389c3f-69cc-49bd-b413-3fbbf370ad41","Type":"ContainerStarted","Data":"fd0c7ae9c1e84e3b123685521754c8ffb9fbe4b5ed091ef444f2bad6179b80e7"} Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.378428 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-qmqbr" podStartSLOduration=122.378403013 podStartE2EDuration="2m2.378403013s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:02.376804294 +0000 UTC m=+143.060208570" watchObservedRunningTime="2025-12-05 23:22:02.378403013 +0000 UTC m=+143.061807289" Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.385467 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 
23:22:02 crc kubenswrapper[4734]: E1205 23:22:02.386110 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:02.886093601 +0000 UTC m=+143.569497887 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.418259 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zwcnb" podStartSLOduration=122.418240201 podStartE2EDuration="2m2.418240201s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:02.417480793 +0000 UTC m=+143.100885069" watchObservedRunningTime="2025-12-05 23:22:02.418240201 +0000 UTC m=+143.101644477" Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.445621 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-m6c2b" event={"ID":"64b05eab-74bd-43bd-b206-54f4e784e581","Type":"ContainerStarted","Data":"637fada9c6cbb0a0061d269c38f37463a533247b7b6f2fb50c0828e85d76e0bf"} Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.466114 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-svcq4" 
event={"ID":"a58228cf-5189-4c26-b772-a1c2145873a0","Type":"ContainerStarted","Data":"8e86e9e4fa5c6498e82ffb5bde0f892af9b11152d74b8c1ac5480b385b4c913f"} Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.490603 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:02 crc kubenswrapper[4734]: E1205 23:22:02.492633 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:02.992589368 +0000 UTC m=+143.675993634 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.501021 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-mghs5" event={"ID":"d5a2bf75-6d4d-40e9-a4d2-0aa192d25cc8","Type":"ContainerStarted","Data":"5b1aceadcffe300bc58c1e47686c28ce877b69542cb41ac40d97e4e0ccfaa543"} Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.531485 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2dx28" 
event={"ID":"a0152808-f0b7-4ce4-9bc1-6bc11e69bd7f","Type":"ContainerStarted","Data":"c89577ae2cac9391439791c6d9f013b0aa30636498adaad61399049b8a1429a8"} Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.553413 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-htwjw" event={"ID":"d391f1fa-9bbe-478c-a1da-6ccb8f75f3c5","Type":"ContainerStarted","Data":"03e71a4d79e91799f48b92c51d3cfea79b11516f8b0e43795e10129ef3634c1b"} Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.555089 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-htwjw" Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.573459 4734 patch_prober.go:28] interesting pod/downloads-7954f5f757-htwjw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.573555 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-htwjw" podUID="d391f1fa-9bbe-478c-a1da-6ccb8f75f3c5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.571507 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-mghs5" podStartSLOduration=6.571479117 podStartE2EDuration="6.571479117s" podCreationTimestamp="2025-12-05 23:21:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:02.568986685 +0000 UTC m=+143.252390961" watchObservedRunningTime="2025-12-05 23:22:02.571479117 +0000 UTC m=+143.254883383" Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 
23:22:02.593571 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:02 crc kubenswrapper[4734]: E1205 23:22:02.596416 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:03.096401368 +0000 UTC m=+143.779805634 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.618734 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wl2fs" event={"ID":"99827392-eef9-4b43-ab05-d57f8bc8d3ef","Type":"ContainerStarted","Data":"382a48176a6e3d2f98ebce0ad7f1124f30a86b3e6787790ccfdbedd3c9402b54"} Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.631761 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2dx28" podStartSLOduration=122.631739657 podStartE2EDuration="2m2.631739657s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:02.63062044 +0000 UTC m=+143.314024716" watchObservedRunningTime="2025-12-05 23:22:02.631739657 +0000 UTC m=+143.315143933" Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.655647 4734 generic.go:334] "Generic (PLEG): container finished" podID="776e53fa-bf9e-44c4-8f89-2f78059733a7" containerID="af559860fa949935b5a4c2ab8de663c8b3dad44a5b93bba2fbdb8d00ad1b7bb3" exitCode=0 Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.656410 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-txdvl" event={"ID":"776e53fa-bf9e-44c4-8f89-2f78059733a7","Type":"ContainerDied","Data":"af559860fa949935b5a4c2ab8de663c8b3dad44a5b93bba2fbdb8d00ad1b7bb3"} Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.670268 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-htwjw" podStartSLOduration=122.670250493 podStartE2EDuration="2m2.670250493s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:02.663839516 +0000 UTC m=+143.347243792" watchObservedRunningTime="2025-12-05 23:22:02.670250493 +0000 UTC m=+143.353654769" Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.697458 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ckxzz" event={"ID":"17750e75-596f-4637-a240-55aabb725a86","Type":"ContainerStarted","Data":"c2c9afdd5ce8aad9069852775962c411728df18696b4559dd0c444addf1cf78d"} Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.699079 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:02 crc kubenswrapper[4734]: E1205 23:22:02.700402 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:03.200379604 +0000 UTC m=+143.883783880 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.742957 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kkt7h" event={"ID":"9e08ddd6-cfa7-4e6b-902f-d789f91fd70b","Type":"ContainerStarted","Data":"05cf7b6e81f62410a0a79d5530ab91800af6cce94b5431aadb19551b8ff4cb43"} Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.743248 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kkt7h" event={"ID":"9e08ddd6-cfa7-4e6b-902f-d789f91fd70b","Type":"ContainerStarted","Data":"a0d675edbb6ddd1e6cee4969e5d16b84474d19773e29a2f4e501ebd6b5d617bb"} Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.752846 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k9rd" 
event={"ID":"547872c0-29db-43b6-a531-14610127080d","Type":"ContainerStarted","Data":"2bc5e93cb493f0ea3ab4db2cc2ceff004920f9c22e3516627be1f0a7c6bdc77c"} Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.754775 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w5bmh" event={"ID":"c8fec47c-0cfe-45b1-9f45-5eba4c924359","Type":"ContainerStarted","Data":"25ddd09f94ba2fd4c7ec805329c2fdfc5d5715e68cf95901e0adea0ba59651bb"} Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.755441 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qvhvd" event={"ID":"7b7cfcc6-ee0e-4af5-9e03-5d8cbb40edbb","Type":"ContainerStarted","Data":"5402a3c741118e8aecc3e5d5d380eaeda535140c9fbb38894157552d4a66587d"} Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.762166 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz" event={"ID":"07e87699-6af3-4f68-a3c8-85780433774b","Type":"ContainerStarted","Data":"6282a7b3f72444edbb6b44fa33339a79edda07c4a61b40d0523594195abc0c5a"} Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.785756 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" event={"ID":"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87","Type":"ContainerStarted","Data":"7ca3c043f4229050f39ccf624b9f967a42b32001cbf42d05b1942e0893eb4dbb"} Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.786552 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.793096 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4k9rd" podStartSLOduration=122.793082821 podStartE2EDuration="2m2.793082821s" podCreationTimestamp="2025-12-05 
23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:02.791223936 +0000 UTC m=+143.474628212" watchObservedRunningTime="2025-12-05 23:22:02.793082821 +0000 UTC m=+143.476487087" Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.797085 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dqz9r" event={"ID":"36efc3c0-8de6-423d-bb0e-c76488f53955","Type":"ContainerStarted","Data":"65329ead81138c27ef3d933ee45a7f71b0d0ca7829a51be105845f9947242d5d"} Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.801013 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:02 crc kubenswrapper[4734]: E1205 23:22:02.802101 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:03.302088182 +0000 UTC m=+143.985492458 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.818746 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lr2g" event={"ID":"24632d76-79cd-400b-bfad-a4c8a0ffbb68","Type":"ContainerStarted","Data":"a2612df34546d468666f05f7fc846c32ad181a9eacb6b551cb7102a96bb8ed2a"} Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.838829 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9vxl" event={"ID":"de8d0e19-aece-4044-9eb8-ede1e5edda45","Type":"ContainerStarted","Data":"5eb18feefc24f36f3e239a01557373b519c54eb92026e3d3094c930147296f6e"} Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.838881 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9vxl" event={"ID":"de8d0e19-aece-4044-9eb8-ede1e5edda45","Type":"ContainerStarted","Data":"bedecf0cc9899e129944c3479cf0bf31fc24824d556dd3722987a18fec68dcac"} Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.850672 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-n9fhz" event={"ID":"eb3de870-5133-4874-bc83-37be5f299296","Type":"ContainerStarted","Data":"133cb1a4d54c2d5f231b0405873ee246b53366082b8e27202c6953c9bba2c9da"} Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.850745 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-n9fhz" event={"ID":"eb3de870-5133-4874-bc83-37be5f299296","Type":"ContainerStarted","Data":"088faa072c4480f2e1ff9cae96d18a258039290d828849815002f618c908b977"} Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.861439 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" podStartSLOduration=122.861414811 podStartE2EDuration="2m2.861414811s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:02.824910914 +0000 UTC m=+143.508315180" watchObservedRunningTime="2025-12-05 23:22:02.861414811 +0000 UTC m=+143.544819087" Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.861858 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz" podStartSLOduration=122.861854231 podStartE2EDuration="2m2.861854231s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:02.859114354 +0000 UTC m=+143.542518630" watchObservedRunningTime="2025-12-05 23:22:02.861854231 +0000 UTC m=+143.545258507" Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.861468 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tsb97" event={"ID":"2e6d7ac5-5f33-4561-a79b-685f9ae74144","Type":"ContainerStarted","Data":"8bb839edda1460171daad921dcfd6499006abc6741ff13bfd3daef4e550e7b48"} Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.892958 4734 generic.go:334] "Generic (PLEG): container finished" podID="7adf273b-63fb-40fe-9d0d-fe467260565b" containerID="39e26dd4696cd8476884be963a45fe9648e12c2f15834d39b05a0363f66cf128" 
exitCode=0 Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.893610 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-x67qn" event={"ID":"7adf273b-63fb-40fe-9d0d-fe467260565b","Type":"ContainerDied","Data":"39e26dd4696cd8476884be963a45fe9648e12c2f15834d39b05a0363f66cf128"} Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.905096 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:02 crc kubenswrapper[4734]: E1205 23:22:02.906657 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:03.406636211 +0000 UTC m=+144.090040487 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.912237 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-n9fhz" podStartSLOduration=122.912210338 podStartE2EDuration="2m2.912210338s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:02.912014924 +0000 UTC m=+143.595419200" watchObservedRunningTime="2025-12-05 23:22:02.912210338 +0000 UTC m=+143.595614614" Dec 05 23:22:02 crc kubenswrapper[4734]: I1205 23:22:02.964557 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416275-mfkvp" event={"ID":"2a20dbad-8352-4804-9c0e-a2b6108a0d1b","Type":"ContainerStarted","Data":"c2ab13668511b3efa65133e7ec2f85f1d91583ee811fb50b4e0a228eac2de9b8"} Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.000732 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ws6qt" event={"ID":"0cd8fc51-deec-410b-b2bb-4818c2f71230","Type":"ContainerStarted","Data":"75d5b89b3d055e34efd85b0c0c92d7321eb0a77cc476c9d678773513d6d4e949"} Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.001703 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-ws6qt" Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.008792 4734 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:03 crc kubenswrapper[4734]: E1205 23:22:03.012182 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:03.512165234 +0000 UTC m=+144.195569510 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.018869 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lr2g" podStartSLOduration=123.018845998 podStartE2EDuration="2m3.018845998s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:03.017427554 +0000 UTC m=+143.700831830" watchObservedRunningTime="2025-12-05 23:22:03.018845998 +0000 UTC m=+143.702250274" Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.022567 4734 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-ws6qt container/marketplace-operator 
namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.022623 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-ws6qt" podUID="0cd8fc51-deec-410b-b2bb-4818c2f71230" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.071751 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qqvvd" event={"ID":"82e59745-0a0a-4c7c-a61b-d801aed4d11d","Type":"ContainerStarted","Data":"29923c31c1b70f548ab5185e157e827b52e154cf03d31891589c36c344b1cf7d"} Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.102992 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrqvs" event={"ID":"0de1aed3-e393-4d4f-b201-12142736c664","Type":"ContainerStarted","Data":"bd2732ee7263c284db50734df6ef9963068c063106907816480797b49ec1e56c"} Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.104341 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrqvs" Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.110477 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:03 crc kubenswrapper[4734]: E1205 23:22:03.111966 4734 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:03.611947186 +0000 UTC m=+144.295351462 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.154041 4734 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-mrqvs container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.154578 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrqvs" podUID="0de1aed3-e393-4d4f-b201-12142736c664" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.175285 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-5h9wr" event={"ID":"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe","Type":"ContainerStarted","Data":"a4a73def955379da1e0b0bb1e059e2e98a293893f07530194f7dd08dad90854a"} Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.213162 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:03 crc kubenswrapper[4734]: E1205 23:22:03.213641 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:03.713622744 +0000 UTC m=+144.397027020 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.217879 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-ws6qt" podStartSLOduration=123.217847788 podStartE2EDuration="2m3.217847788s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:03.206081329 +0000 UTC m=+143.889485595" watchObservedRunningTime="2025-12-05 23:22:03.217847788 +0000 UTC m=+143.901252064" Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.239629 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wmxp" 
event={"ID":"a20501a9-7a2f-46ba-8322-a9b38d14bb4a","Type":"ContainerStarted","Data":"f09de4e4e33c59e0613998beedc0c7ce763a1f258f77c4ba41d45e7eb6c18dc6"} Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.240211 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wmxp" Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.271678 4734 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-4wmxp container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.271761 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wmxp" podUID="a20501a9-7a2f-46ba-8322-a9b38d14bb4a" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.274017 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6l9sd" event={"ID":"3be50f08-5e27-434b-8862-52c075569d6d","Type":"ContainerStarted","Data":"cf98b257020c66cf090336df69c447289a29ccf32143fc60672398d3deca283d"} Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.279117 4734 patch_prober.go:28] interesting pod/router-default-5444994796-2djpb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 23:22:03 crc kubenswrapper[4734]: [-]has-synced failed: reason withheld Dec 05 23:22:03 crc kubenswrapper[4734]: [+]process-running ok Dec 05 23:22:03 crc kubenswrapper[4734]: healthz check failed Dec 05 23:22:03 crc 
kubenswrapper[4734]: I1205 23:22:03.279152 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2djpb" podUID="8251ea29-3180-4d6c-a6f7-6477bcd8ed6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.319876 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k9fv6" event={"ID":"0663418a-98b9-48b4-869e-257a7bddd32e","Type":"ContainerStarted","Data":"cac7ee8158cda7b41327d9a23df525b32d23240c6db3dccc256c76aea3e23062"} Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.322999 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29416275-mfkvp" podStartSLOduration=123.322967411 podStartE2EDuration="2m3.322967411s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:03.274408798 +0000 UTC m=+143.957813074" watchObservedRunningTime="2025-12-05 23:22:03.322967411 +0000 UTC m=+144.006371687" Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.323604 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:03 crc kubenswrapper[4734]: E1205 23:22:03.326788 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-05 23:22:03.826753594 +0000 UTC m=+144.510157870 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.432323 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wmxp" podStartSLOduration=123.432299058 podStartE2EDuration="2m3.432299058s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:03.401400648 +0000 UTC m=+144.084804924" watchObservedRunningTime="2025-12-05 23:22:03.432299058 +0000 UTC m=+144.115703334" Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.435595 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:03 crc kubenswrapper[4734]: E1205 23:22:03.435933 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:03.935919226 +0000 UTC m=+144.619323502 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.481803 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qqvvd" podStartSLOduration=123.481779823 podStartE2EDuration="2m3.481779823s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:03.481436725 +0000 UTC m=+144.164841001" watchObservedRunningTime="2025-12-05 23:22:03.481779823 +0000 UTC m=+144.165184099" Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.490506 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k9fv6" podStartSLOduration=123.490483937 podStartE2EDuration="2m3.490483937s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:03.434750228 +0000 UTC m=+144.118154504" watchObservedRunningTime="2025-12-05 23:22:03.490483937 +0000 UTC m=+144.173888213" Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.538009 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:03 crc kubenswrapper[4734]: E1205 23:22:03.538430 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:04.038409994 +0000 UTC m=+144.721814260 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.640099 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:03 crc kubenswrapper[4734]: E1205 23:22:03.640903 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:04.140890402 +0000 UTC m=+144.824294678 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.742085 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:03 crc kubenswrapper[4734]: E1205 23:22:03.742495 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:04.242476719 +0000 UTC m=+144.925880995 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.787573 4734 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-gxdpj container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.8:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.787635 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" podUID="e40087a5-cb18-4d2f-8e68-cc6e09c5bd87" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.8:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.844750 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:03 crc kubenswrapper[4734]: E1205 23:22:03.845227 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-05 23:22:04.345206172 +0000 UTC m=+145.028610448 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.949320 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:03 crc kubenswrapper[4734]: E1205 23:22:03.949582 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:04.449543387 +0000 UTC m=+145.132947663 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:03 crc kubenswrapper[4734]: I1205 23:22:03.949790 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:03 crc kubenswrapper[4734]: E1205 23:22:03.950179 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:04.450171172 +0000 UTC m=+145.133575448 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.051051 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:04 crc kubenswrapper[4734]: E1205 23:22:04.051238 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:04.551177744 +0000 UTC m=+145.234582020 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.051804 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:04 crc kubenswrapper[4734]: E1205 23:22:04.052300 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:04.552289211 +0000 UTC m=+145.235693487 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.153427 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:04 crc kubenswrapper[4734]: E1205 23:22:04.154012 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:04.65399342 +0000 UTC m=+145.337397696 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.255843 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:04 crc kubenswrapper[4734]: E1205 23:22:04.256407 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:04.756388136 +0000 UTC m=+145.439792412 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.281756 4734 patch_prober.go:28] interesting pod/router-default-5444994796-2djpb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 23:22:04 crc kubenswrapper[4734]: [-]has-synced failed: reason withheld Dec 05 23:22:04 crc kubenswrapper[4734]: [+]process-running ok Dec 05 23:22:04 crc kubenswrapper[4734]: healthz check failed Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.281822 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2djpb" podUID="8251ea29-3180-4d6c-a6f7-6477bcd8ed6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.352602 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ws6qt" event={"ID":"0cd8fc51-deec-410b-b2bb-4818c2f71230","Type":"ContainerStarted","Data":"62cf1aaf8f798a7c616550fba162e6d45a92efe82a855b1f8651d9c990978bfe"} Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.354332 4734 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-ws6qt container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Dec 05 23:22:04 crc 
kubenswrapper[4734]: I1205 23:22:04.354386 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-ws6qt" podUID="0cd8fc51-deec-410b-b2bb-4818c2f71230" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.356626 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:04 crc kubenswrapper[4734]: E1205 23:22:04.356992 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:04.856971957 +0000 UTC m=+145.540376233 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.362828 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tsb97" event={"ID":"2e6d7ac5-5f33-4561-a79b-685f9ae74144","Type":"ContainerStarted","Data":"77ab4dbd0a33885a30359fd838e90c31146c21a235ff040c377928eeb1ee4bac"} Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.362880 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tsb97" event={"ID":"2e6d7ac5-5f33-4561-a79b-685f9ae74144","Type":"ContainerStarted","Data":"6165dbae1079b3c80de6f82e3d102696a0fa808a7c48b95e1cca88fe449ce51b"} Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.363665 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tsb97" Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.378072 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrqvs" event={"ID":"0de1aed3-e393-4d4f-b201-12142736c664","Type":"ContainerStarted","Data":"4073cd245223d1697d6a6ba395f0b3d9121639128fa63f67db633e1d8a812ac2"} Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.393358 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrqvs" Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.407783 4734 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-m6c2b" event={"ID":"64b05eab-74bd-43bd-b206-54f4e784e581","Type":"ContainerStarted","Data":"a1486b28cc94b7090891654d679750aef1c1e77ad058b22deb5f79ea38ecb4c5"} Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.408357 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mrqvs" podStartSLOduration=124.40833714 podStartE2EDuration="2m4.40833714s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:03.522993616 +0000 UTC m=+144.206397892" watchObservedRunningTime="2025-12-05 23:22:04.40833714 +0000 UTC m=+145.091741416" Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.439118 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tsb97" podStartSLOduration=124.439098266 podStartE2EDuration="2m4.439098266s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:04.408040172 +0000 UTC m=+145.091444448" watchObservedRunningTime="2025-12-05 23:22:04.439098266 +0000 UTC m=+145.122502542" Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.459976 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:04 crc kubenswrapper[4734]: E1205 23:22:04.463032 4734 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:04.963005913 +0000 UTC m=+145.646410189 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.463845 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-m6c2b" podStartSLOduration=124.463824072 podStartE2EDuration="2m4.463824072s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:04.457010485 +0000 UTC m=+145.140414761" watchObservedRunningTime="2025-12-05 23:22:04.463824072 +0000 UTC m=+145.147228348" Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.482180 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-svcq4" event={"ID":"a58228cf-5189-4c26-b772-a1c2145873a0","Type":"ContainerStarted","Data":"089a5c1623fd2e3dc0ef8fbec8b3c6f82f443d0c5661728bb37c2a5d0edf994f"} Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.499061 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4htrx" event={"ID":"eae91bc6-fbec-4bb5-81f7-254dc473427e","Type":"ContainerStarted","Data":"d9292cf2077c75e1da031bb30af0d6c35723af9bc60a8873a3e6a86d3663db6d"} Dec 05 23:22:04 
crc kubenswrapper[4734]: I1205 23:22:04.499132 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4htrx" event={"ID":"eae91bc6-fbec-4bb5-81f7-254dc473427e","Type":"ContainerStarted","Data":"a71cfd59d51d51f5e37a4738a90f73b062a6845292730a53aa9611f4dd8f9452"} Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.499331 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-4htrx" Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.501448 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jlxs" event={"ID":"05c3d993-bbb4-4f67-8952-23d9b107b889","Type":"ContainerStarted","Data":"c4749fa353eee25b0c96ede83b007df9eb8c3b685eef863f2a6838b86e8ad469"} Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.501482 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jlxs" event={"ID":"05c3d993-bbb4-4f67-8952-23d9b107b889","Type":"ContainerStarted","Data":"4e4ef02c6dc8319a5eabbb1f2609c25bd37d69728c90b20ad5aabbb7238bc7b5"} Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.501872 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jlxs" Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.520829 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-x67qn" event={"ID":"7adf273b-63fb-40fe-9d0d-fe467260565b","Type":"ContainerStarted","Data":"58cc2bcd5bea289f116d6f4b223178498a97480df1b14700692abba4312d37f2"} Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.520887 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-x67qn" event={"ID":"7adf273b-63fb-40fe-9d0d-fe467260565b","Type":"ContainerStarted","Data":"6535f7dc4d084fd3302398c9a892e9bec9c14809165b4103be2a7bb4512ac6b3"} 
Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.541050 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wl2fs" event={"ID":"99827392-eef9-4b43-ab05-d57f8bc8d3ef","Type":"ContainerStarted","Data":"0c11353b4c42d308089ca938fd5fb3493bd4994a6a89928481b62297cbbbfdf8"} Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.562274 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-svcq4" podStartSLOduration=124.562230821 podStartE2EDuration="2m4.562230821s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:04.521003668 +0000 UTC m=+145.204407944" watchObservedRunningTime="2025-12-05 23:22:04.562230821 +0000 UTC m=+145.245635097" Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.562579 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.562857 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4htrx" podStartSLOduration=8.562850956 podStartE2EDuration="8.562850956s" podCreationTimestamp="2025-12-05 23:21:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:04.561899172 +0000 UTC m=+145.245303448" watchObservedRunningTime="2025-12-05 23:22:04.562850956 +0000 UTC m=+145.246255232" Dec 05 23:22:04 crc kubenswrapper[4734]: E1205 23:22:04.563439 4734 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:05.063402689 +0000 UTC m=+145.746806965 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.613865 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dqz9r" event={"ID":"36efc3c0-8de6-423d-bb0e-c76488f53955","Type":"ContainerStarted","Data":"a1d043a5f01bddde97864f9ba5b62b0875d1045e6f98c08498198e2d550d1e5c"} Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.613944 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dqz9r" event={"ID":"36efc3c0-8de6-423d-bb0e-c76488f53955","Type":"ContainerStarted","Data":"60dd5c30f071a0b3c11498809e3dbdf573263a71c0104111b1481ebb9638234e"} Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.624160 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9lr2g" event={"ID":"24632d76-79cd-400b-bfad-a4c8a0ffbb68","Type":"ContainerStarted","Data":"b09fbe42de03ef1d48616afafb19c3d245343c63bcfe7c54e858ff0d6ffdfec0"} Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.626387 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w5bmh" 
event={"ID":"c8fec47c-0cfe-45b1-9f45-5eba4c924359","Type":"ContainerStarted","Data":"7721d8a72f12177447fa76f333d6d34f29a3cffe9aa46e659f3138aa8dd5fb36"} Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.636422 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-x67qn" podStartSLOduration=124.636396072 podStartE2EDuration="2m4.636396072s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:04.627993176 +0000 UTC m=+145.311397452" watchObservedRunningTime="2025-12-05 23:22:04.636396072 +0000 UTC m=+145.319800338" Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.639422 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6l9sd" event={"ID":"3be50f08-5e27-434b-8862-52c075569d6d","Type":"ContainerStarted","Data":"a16ef35a1e204fbdaa1f05fa01cb857903402a5ce49f0e6b1eaebf360340d1a7"} Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.653866 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-ds2cw" event={"ID":"8f1fb91a-4e37-4bad-baa4-4996c7dd06e8","Type":"ContainerStarted","Data":"3e55cb0bafa2fa7b97ec985cedf0a3263d9f6ee24f6f2e29dc6ed342777608c7"} Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.653925 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-ds2cw" event={"ID":"8f1fb91a-4e37-4bad-baa4-4996c7dd06e8","Type":"ContainerStarted","Data":"97b4c09d3ba59c874b3c85f15f5d218d370afeaede0da1f21a199f3e0d4a0705"} Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.668467 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:04 crc kubenswrapper[4734]: E1205 23:22:04.670010 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:05.169992778 +0000 UTC m=+145.853397054 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.669506 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jlxs" podStartSLOduration=124.669492096 podStartE2EDuration="2m4.669492096s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:04.666101793 +0000 UTC m=+145.349506069" watchObservedRunningTime="2025-12-05 23:22:04.669492096 +0000 UTC m=+145.352896372" Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.684684 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-txdvl" 
event={"ID":"776e53fa-bf9e-44c4-8f89-2f78059733a7","Type":"ContainerStarted","Data":"16cca1f0d18dfc30938a83a04d2d742b875f541c72150e37804195222d0f7bdc"} Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.685386 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-txdvl" Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.704279 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wmxp" event={"ID":"a20501a9-7a2f-46ba-8322-a9b38d14bb4a","Type":"ContainerStarted","Data":"4e56b526ea48d536299ce0cf4d1e6a5afba951ea338ef1797ef7dacc09867294"} Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.705709 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dqz9r" podStartSLOduration=124.705686046 podStartE2EDuration="2m4.705686046s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:04.704834984 +0000 UTC m=+145.388239280" watchObservedRunningTime="2025-12-05 23:22:04.705686046 +0000 UTC m=+145.389090322" Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.733027 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ckxzz" event={"ID":"17750e75-596f-4637-a240-55aabb725a86","Type":"ContainerStarted","Data":"ea0ec2cde9451a8e77ad9252dd46a22f4f5eefc33a0405b12f875bb45e024201"} Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.733363 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4wmxp" Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.766890 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kkt7h" event={"ID":"9e08ddd6-cfa7-4e6b-902f-d789f91fd70b","Type":"ContainerStarted","Data":"b200feb9ba44ff46cb2ff8c82b0cb6485aa35567bb73faa496042b47fa251a19"} Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.772288 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:04 crc kubenswrapper[4734]: E1205 23:22:04.773162 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:05.273143743 +0000 UTC m=+145.956548019 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.784498 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6l9sd" podStartSLOduration=8.784474271 podStartE2EDuration="8.784474271s" podCreationTimestamp="2025-12-05 23:21:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:04.781343944 +0000 UTC m=+145.464748240" watchObservedRunningTime="2025-12-05 23:22:04.784474271 +0000 UTC m=+145.467878537" Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.797502 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qvhvd" event={"ID":"7b7cfcc6-ee0e-4af5-9e03-5d8cbb40edbb","Type":"ContainerStarted","Data":"919efff612bd1f8548819e1c7aa3d98aa798e67205b26b2b4100ee73953938ca"} Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.830810 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9vxl" event={"ID":"de8d0e19-aece-4044-9eb8-ede1e5edda45","Type":"ContainerStarted","Data":"f68ed3fd9d0040b17ecdcdfbfd0b3c9725538ae202e2e8d37902d4339ab47cfd"} Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.847164 4734 patch_prober.go:28] interesting pod/downloads-7954f5f757-htwjw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 
10.217.0.9:8080: connect: connection refused" start-of-body= Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.847257 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-htwjw" podUID="d391f1fa-9bbe-478c-a1da-6ccb8f75f3c5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.857426 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.875196 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:04 crc kubenswrapper[4734]: E1205 23:22:04.878450 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:05.37843174 +0000 UTC m=+146.061836016 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.922303 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-ds2cw" podStartSLOduration=124.922279808 podStartE2EDuration="2m4.922279808s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:04.874293768 +0000 UTC m=+145.557698034" watchObservedRunningTime="2025-12-05 23:22:04.922279808 +0000 UTC m=+145.605684084" Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.924079 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wl2fs" podStartSLOduration=124.924069131 podStartE2EDuration="2m4.924069131s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:04.922100093 +0000 UTC m=+145.605504379" watchObservedRunningTime="2025-12-05 23:22:04.924069131 +0000 UTC m=+145.607473407" Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.967515 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qvhvd" podStartSLOduration=124.967493058 podStartE2EDuration="2m4.967493058s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:04.964518485 +0000 UTC m=+145.647922761" watchObservedRunningTime="2025-12-05 23:22:04.967493058 +0000 UTC m=+145.650897324" Dec 05 23:22:04 crc kubenswrapper[4734]: I1205 23:22:04.976961 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:04 crc kubenswrapper[4734]: E1205 23:22:04.978865 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:05.478845567 +0000 UTC m=+146.162249843 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.079309 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-txdvl" podStartSLOduration=125.079281495 podStartE2EDuration="2m5.079281495s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:05.028409215 +0000 UTC m=+145.711813491" watchObservedRunningTime="2025-12-05 23:22:05.079281495 +0000 UTC m=+145.762685771" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.079998 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:05 crc kubenswrapper[4734]: E1205 23:22:05.080442 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:05.580428153 +0000 UTC m=+146.263832419 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.133069 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kkt7h" podStartSLOduration=125.133044456 podStartE2EDuration="2m5.133044456s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:05.086934363 +0000 UTC m=+145.770338639" watchObservedRunningTime="2025-12-05 23:22:05.133044456 +0000 UTC m=+145.816448732" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.160037 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ckxzz" podStartSLOduration=125.160014848 podStartE2EDuration="2m5.160014848s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:05.156770829 +0000 UTC m=+145.840175125" watchObservedRunningTime="2025-12-05 23:22:05.160014848 +0000 UTC m=+145.843419144" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.160189 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-d9vxl" podStartSLOduration=125.160182463 podStartE2EDuration="2m5.160182463s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:05.135973298 +0000 UTC m=+145.819377574" watchObservedRunningTime="2025-12-05 23:22:05.160182463 +0000 UTC m=+145.843586739" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.190816 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:05 crc kubenswrapper[4734]: E1205 23:22:05.191395 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:05.691365169 +0000 UTC m=+146.374769445 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.279153 4734 patch_prober.go:28] interesting pod/router-default-5444994796-2djpb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 23:22:05 crc kubenswrapper[4734]: [-]has-synced failed: reason withheld Dec 05 23:22:05 crc kubenswrapper[4734]: [+]process-running ok Dec 05 23:22:05 crc kubenswrapper[4734]: healthz check failed Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.279248 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2djpb" podUID="8251ea29-3180-4d6c-a6f7-6477bcd8ed6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.293830 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:05 crc kubenswrapper[4734]: E1205 23:22:05.294352 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-05 23:22:05.794316088 +0000 UTC m=+146.477720364 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.342641 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hmv47"] Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.343650 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hmv47" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.349898 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.367887 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hmv47"] Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.395220 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.395439 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e24c08e-fb74-4ae6-9c48-ae9653c964e8-catalog-content\") pod 
\"certified-operators-hmv47\" (UID: \"7e24c08e-fb74-4ae6-9c48-ae9653c964e8\") " pod="openshift-marketplace/certified-operators-hmv47" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.395545 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grpqv\" (UniqueName: \"kubernetes.io/projected/7e24c08e-fb74-4ae6-9c48-ae9653c964e8-kube-api-access-grpqv\") pod \"certified-operators-hmv47\" (UID: \"7e24c08e-fb74-4ae6-9c48-ae9653c964e8\") " pod="openshift-marketplace/certified-operators-hmv47" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.395603 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e24c08e-fb74-4ae6-9c48-ae9653c964e8-utilities\") pod \"certified-operators-hmv47\" (UID: \"7e24c08e-fb74-4ae6-9c48-ae9653c964e8\") " pod="openshift-marketplace/certified-operators-hmv47" Dec 05 23:22:05 crc kubenswrapper[4734]: E1205 23:22:05.395658 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:05.895627688 +0000 UTC m=+146.579031964 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.497401 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.497848 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grpqv\" (UniqueName: \"kubernetes.io/projected/7e24c08e-fb74-4ae6-9c48-ae9653c964e8-kube-api-access-grpqv\") pod \"certified-operators-hmv47\" (UID: \"7e24c08e-fb74-4ae6-9c48-ae9653c964e8\") " pod="openshift-marketplace/certified-operators-hmv47" Dec 05 23:22:05 crc kubenswrapper[4734]: E1205 23:22:05.497954 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:05.997923961 +0000 UTC m=+146.681328237 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.498239 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e24c08e-fb74-4ae6-9c48-ae9653c964e8-utilities\") pod \"certified-operators-hmv47\" (UID: \"7e24c08e-fb74-4ae6-9c48-ae9653c964e8\") " pod="openshift-marketplace/certified-operators-hmv47" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.498298 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e24c08e-fb74-4ae6-9c48-ae9653c964e8-catalog-content\") pod \"certified-operators-hmv47\" (UID: \"7e24c08e-fb74-4ae6-9c48-ae9653c964e8\") " pod="openshift-marketplace/certified-operators-hmv47" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.498735 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e24c08e-fb74-4ae6-9c48-ae9653c964e8-catalog-content\") pod \"certified-operators-hmv47\" (UID: \"7e24c08e-fb74-4ae6-9c48-ae9653c964e8\") " pod="openshift-marketplace/certified-operators-hmv47" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.499065 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e24c08e-fb74-4ae6-9c48-ae9653c964e8-utilities\") pod \"certified-operators-hmv47\" (UID: \"7e24c08e-fb74-4ae6-9c48-ae9653c964e8\") " 
pod="openshift-marketplace/certified-operators-hmv47" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.506421 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cb4rj"] Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.507504 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cb4rj" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.508260 4734 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-4jlxs container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.508303 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jlxs" podUID="05c3d993-bbb4-4f67-8952-23d9b107b889" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.24:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.515413 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.530100 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grpqv\" (UniqueName: \"kubernetes.io/projected/7e24c08e-fb74-4ae6-9c48-ae9653c964e8-kube-api-access-grpqv\") pod \"certified-operators-hmv47\" (UID: \"7e24c08e-fb74-4ae6-9c48-ae9653c964e8\") " pod="openshift-marketplace/certified-operators-hmv47" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.533828 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/community-operators-cb4rj"] Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.599185 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:05 crc kubenswrapper[4734]: E1205 23:22:05.599413 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:06.099374464 +0000 UTC m=+146.782778740 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.599583 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.599615 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bsvp\" (UniqueName: 
\"kubernetes.io/projected/597348be-fe32-4495-bb10-d152ed593e3e-kube-api-access-8bsvp\") pod \"community-operators-cb4rj\" (UID: \"597348be-fe32-4495-bb10-d152ed593e3e\") " pod="openshift-marketplace/community-operators-cb4rj" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.599672 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/597348be-fe32-4495-bb10-d152ed593e3e-utilities\") pod \"community-operators-cb4rj\" (UID: \"597348be-fe32-4495-bb10-d152ed593e3e\") " pod="openshift-marketplace/community-operators-cb4rj" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.599734 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/597348be-fe32-4495-bb10-d152ed593e3e-catalog-content\") pod \"community-operators-cb4rj\" (UID: \"597348be-fe32-4495-bb10-d152ed593e3e\") " pod="openshift-marketplace/community-operators-cb4rj" Dec 05 23:22:05 crc kubenswrapper[4734]: E1205 23:22:05.600219 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:06.100208305 +0000 UTC m=+146.783612581 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.667304 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hmv47" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.700585 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:05 crc kubenswrapper[4734]: E1205 23:22:05.700831 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:06.200799876 +0000 UTC m=+146.884204152 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.701097 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/597348be-fe32-4495-bb10-d152ed593e3e-catalog-content\") pod \"community-operators-cb4rj\" (UID: \"597348be-fe32-4495-bb10-d152ed593e3e\") " pod="openshift-marketplace/community-operators-cb4rj" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.701231 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.701271 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bsvp\" (UniqueName: \"kubernetes.io/projected/597348be-fe32-4495-bb10-d152ed593e3e-kube-api-access-8bsvp\") pod \"community-operators-cb4rj\" (UID: \"597348be-fe32-4495-bb10-d152ed593e3e\") " pod="openshift-marketplace/community-operators-cb4rj" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.701868 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/597348be-fe32-4495-bb10-d152ed593e3e-utilities\") pod \"community-operators-cb4rj\" (UID: \"597348be-fe32-4495-bb10-d152ed593e3e\") " pod="openshift-marketplace/community-operators-cb4rj" Dec 05 23:22:05 crc kubenswrapper[4734]: E1205 23:22:05.701979 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:06.201966115 +0000 UTC m=+146.885370391 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.702067 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/597348be-fe32-4495-bb10-d152ed593e3e-catalog-content\") pod \"community-operators-cb4rj\" (UID: \"597348be-fe32-4495-bb10-d152ed593e3e\") " pod="openshift-marketplace/community-operators-cb4rj" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.702318 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/597348be-fe32-4495-bb10-d152ed593e3e-utilities\") pod \"community-operators-cb4rj\" (UID: \"597348be-fe32-4495-bb10-d152ed593e3e\") " pod="openshift-marketplace/community-operators-cb4rj" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.709615 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cbr82"] Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.711067 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cbr82" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.734266 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cbr82"] Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.754567 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bsvp\" (UniqueName: \"kubernetes.io/projected/597348be-fe32-4495-bb10-d152ed593e3e-kube-api-access-8bsvp\") pod \"community-operators-cb4rj\" (UID: \"597348be-fe32-4495-bb10-d152ed593e3e\") " pod="openshift-marketplace/community-operators-cb4rj" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.804386 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.804695 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmd7l\" (UniqueName: \"kubernetes.io/projected/e6e9b180-8bc8-4f84-b1f7-ec822b6f6560-kube-api-access-hmd7l\") pod \"certified-operators-cbr82\" (UID: \"e6e9b180-8bc8-4f84-b1f7-ec822b6f6560\") " pod="openshift-marketplace/certified-operators-cbr82" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.804734 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6e9b180-8bc8-4f84-b1f7-ec822b6f6560-utilities\") pod \"certified-operators-cbr82\" (UID: \"e6e9b180-8bc8-4f84-b1f7-ec822b6f6560\") " pod="openshift-marketplace/certified-operators-cbr82" Dec 05 23:22:05 crc kubenswrapper[4734]: E1205 23:22:05.804788 4734 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:06.30475537 +0000 UTC m=+146.988159646 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.805080 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6e9b180-8bc8-4f84-b1f7-ec822b6f6560-catalog-content\") pod \"certified-operators-cbr82\" (UID: \"e6e9b180-8bc8-4f84-b1f7-ec822b6f6560\") " pod="openshift-marketplace/certified-operators-cbr82" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.849238 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cb4rj" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.855329 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w5bmh" event={"ID":"c8fec47c-0cfe-45b1-9f45-5eba4c924359","Type":"ContainerStarted","Data":"b83315b02648ef024837609fe2205622b49128b5ab6704f1c692b2fd2a6f4613"} Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.869956 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4htrx" event={"ID":"eae91bc6-fbec-4bb5-81f7-254dc473427e","Type":"ContainerStarted","Data":"8e947b7705ed18be43bab67128f14105e6fdbcbe7b396f0f7cb1636ef3b11e6e"} Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.872802 4734 patch_prober.go:28] interesting pod/downloads-7954f5f757-htwjw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.872854 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-htwjw" podUID="d391f1fa-9bbe-478c-a1da-6ccb8f75f3c5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.886890 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-ws6qt" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.906593 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6e9b180-8bc8-4f84-b1f7-ec822b6f6560-catalog-content\") pod \"certified-operators-cbr82\" (UID: \"e6e9b180-8bc8-4f84-b1f7-ec822b6f6560\") " 
pod="openshift-marketplace/certified-operators-cbr82" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.906651 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.906693 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmd7l\" (UniqueName: \"kubernetes.io/projected/e6e9b180-8bc8-4f84-b1f7-ec822b6f6560-kube-api-access-hmd7l\") pod \"certified-operators-cbr82\" (UID: \"e6e9b180-8bc8-4f84-b1f7-ec822b6f6560\") " pod="openshift-marketplace/certified-operators-cbr82" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.906729 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6e9b180-8bc8-4f84-b1f7-ec822b6f6560-utilities\") pod \"certified-operators-cbr82\" (UID: \"e6e9b180-8bc8-4f84-b1f7-ec822b6f6560\") " pod="openshift-marketplace/certified-operators-cbr82" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.907240 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6e9b180-8bc8-4f84-b1f7-ec822b6f6560-utilities\") pod \"certified-operators-cbr82\" (UID: \"e6e9b180-8bc8-4f84-b1f7-ec822b6f6560\") " pod="openshift-marketplace/certified-operators-cbr82" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.907591 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6e9b180-8bc8-4f84-b1f7-ec822b6f6560-catalog-content\") pod \"certified-operators-cbr82\" (UID: 
\"e6e9b180-8bc8-4f84-b1f7-ec822b6f6560\") " pod="openshift-marketplace/certified-operators-cbr82" Dec 05 23:22:05 crc kubenswrapper[4734]: E1205 23:22:05.907788 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:06.407763742 +0000 UTC m=+147.091168018 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.908465 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-67l49"] Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.918759 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-67l49" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.944430 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmd7l\" (UniqueName: \"kubernetes.io/projected/e6e9b180-8bc8-4f84-b1f7-ec822b6f6560-kube-api-access-hmd7l\") pod \"certified-operators-cbr82\" (UID: \"e6e9b180-8bc8-4f84-b1f7-ec822b6f6560\") " pod="openshift-marketplace/certified-operators-cbr82" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.955189 4734 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 05 23:22:05 crc kubenswrapper[4734]: I1205 23:22:05.971786 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-67l49"] Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.008643 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.009214 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15bf5615-0adc-46cd-8796-d419076acac7-catalog-content\") pod \"community-operators-67l49\" (UID: \"15bf5615-0adc-46cd-8796-d419076acac7\") " pod="openshift-marketplace/community-operators-67l49" Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.009335 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7hrp\" (UniqueName: \"kubernetes.io/projected/15bf5615-0adc-46cd-8796-d419076acac7-kube-api-access-f7hrp\") pod 
\"community-operators-67l49\" (UID: \"15bf5615-0adc-46cd-8796-d419076acac7\") " pod="openshift-marketplace/community-operators-67l49" Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.009623 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15bf5615-0adc-46cd-8796-d419076acac7-utilities\") pod \"community-operators-67l49\" (UID: \"15bf5615-0adc-46cd-8796-d419076acac7\") " pod="openshift-marketplace/community-operators-67l49" Dec 05 23:22:06 crc kubenswrapper[4734]: E1205 23:22:06.011496 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:06.51147521 +0000 UTC m=+147.194879486 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.082251 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cbr82" Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.111229 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.111280 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15bf5615-0adc-46cd-8796-d419076acac7-catalog-content\") pod \"community-operators-67l49\" (UID: \"15bf5615-0adc-46cd-8796-d419076acac7\") " pod="openshift-marketplace/community-operators-67l49" Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.111306 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7hrp\" (UniqueName: \"kubernetes.io/projected/15bf5615-0adc-46cd-8796-d419076acac7-kube-api-access-f7hrp\") pod \"community-operators-67l49\" (UID: \"15bf5615-0adc-46cd-8796-d419076acac7\") " pod="openshift-marketplace/community-operators-67l49" Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.111345 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.111364 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/15bf5615-0adc-46cd-8796-d419076acac7-utilities\") pod \"community-operators-67l49\" (UID: \"15bf5615-0adc-46cd-8796-d419076acac7\") " pod="openshift-marketplace/community-operators-67l49" Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.111420 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.112219 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:22:06 crc kubenswrapper[4734]: E1205 23:22:06.112550 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:06.612513982 +0000 UTC m=+147.295918248 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.113759 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15bf5615-0adc-46cd-8796-d419076acac7-utilities\") pod \"community-operators-67l49\" (UID: \"15bf5615-0adc-46cd-8796-d419076acac7\") " pod="openshift-marketplace/community-operators-67l49" Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.113981 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15bf5615-0adc-46cd-8796-d419076acac7-catalog-content\") pod \"community-operators-67l49\" (UID: \"15bf5615-0adc-46cd-8796-d419076acac7\") " pod="openshift-marketplace/community-operators-67l49" Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.128671 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.173323 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7hrp\" (UniqueName: \"kubernetes.io/projected/15bf5615-0adc-46cd-8796-d419076acac7-kube-api-access-f7hrp\") pod \"community-operators-67l49\" (UID: 
\"15bf5615-0adc-46cd-8796-d419076acac7\") " pod="openshift-marketplace/community-operators-67l49" Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.212211 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.212588 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.212643 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:22:06 crc kubenswrapper[4734]: E1205 23:22:06.222708 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:06.722674038 +0000 UTC m=+147.406078314 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.230312 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.242505 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4jlxs" Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.243275 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.247943 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-67l49" Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.277692 4734 patch_prober.go:28] interesting pod/router-default-5444994796-2djpb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 23:22:06 crc kubenswrapper[4734]: [-]has-synced failed: reason withheld Dec 05 23:22:06 crc kubenswrapper[4734]: [+]process-running ok Dec 05 23:22:06 crc kubenswrapper[4734]: healthz check failed Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.277749 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2djpb" podUID="8251ea29-3180-4d6c-a6f7-6477bcd8ed6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.316188 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:06 crc kubenswrapper[4734]: E1205 23:22:06.316645 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:06.816628938 +0000 UTC m=+147.500033214 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.316886 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.333956 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.341870 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.419797 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:06 crc kubenswrapper[4734]: E1205 23:22:06.420866 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:06.920843378 +0000 UTC m=+147.604247654 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.421229 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hmv47"] Dec 05 23:22:06 crc kubenswrapper[4734]: W1205 23:22:06.482070 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e24c08e_fb74_4ae6_9c48_ae9653c964e8.slice/crio-f35e82570b9df67ea7b1c3dfa062d26ada21f2ad2f14e7b74cb995ba55ebce6f WatchSource:0}: Error finding container f35e82570b9df67ea7b1c3dfa062d26ada21f2ad2f14e7b74cb995ba55ebce6f: Status 404 returned error can't find the container with id f35e82570b9df67ea7b1c3dfa062d26ada21f2ad2f14e7b74cb995ba55ebce6f Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.521956 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:06 crc kubenswrapper[4734]: E1205 23:22:06.522422 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:07.022405734 +0000 UTC m=+147.705810020 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.592871 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cb4rj"] Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.626073 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:06 crc kubenswrapper[4734]: E1205 23:22:06.626429 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:07.126409979 +0000 UTC m=+147.809814245 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.728262 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cbr82"] Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.729457 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:06 crc kubenswrapper[4734]: E1205 23:22:06.730149 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 23:22:07.230123517 +0000 UTC m=+147.913527793 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ptxqw" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.776654 4734 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-05T23:22:05.955211897Z","Handler":null,"Name":""} Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.831483 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:06 crc kubenswrapper[4734]: E1205 23:22:06.831895 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 23:22:07.331873168 +0000 UTC m=+148.015277444 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.847387 4734 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.847436 4734 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.901933 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hmv47" event={"ID":"7e24c08e-fb74-4ae6-9c48-ae9653c964e8","Type":"ContainerStarted","Data":"f35e82570b9df67ea7b1c3dfa062d26ada21f2ad2f14e7b74cb995ba55ebce6f"} Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.904146 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w5bmh" event={"ID":"c8fec47c-0cfe-45b1-9f45-5eba4c924359","Type":"ContainerStarted","Data":"7a3f7972cfcbd31564c781ba9a9d59fdf754746c45a990daa1b2c8fd3286f1cf"} Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.904937 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cb4rj" event={"ID":"597348be-fe32-4495-bb10-d152ed593e3e","Type":"ContainerStarted","Data":"8485d6fd69838d4f3797b3203ebb9dd871098c143a2e8ff2b9af878a1c3e1633"} Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.908030 4734 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cbr82" event={"ID":"e6e9b180-8bc8-4f84-b1f7-ec822b6f6560","Type":"ContainerStarted","Data":"ebf3e81e7fcb0ee47bfecad9782be3dbcb5e1b8f5fb09825ab1423037466053d"} Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.914881 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-67l49"] Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.924045 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-txdvl" Dec 05 23:22:06 crc kubenswrapper[4734]: I1205 23:22:06.935078 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:07 crc kubenswrapper[4734]: W1205 23:22:07.040641 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-39a40e82fa8ef5c428632d68f9ca5091389946ebfaa132d57fcaa23a7c3d9fd8 WatchSource:0}: Error finding container 39a40e82fa8ef5c428632d68f9ca5091389946ebfaa132d57fcaa23a7c3d9fd8: Status 404 returned error can't find the container with id 39a40e82fa8ef5c428632d68f9ca5091389946ebfaa132d57fcaa23a7c3d9fd8 Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.052037 4734 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.052117 4734 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.124008 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ptxqw\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.149945 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.201778 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.203444 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.234988 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.238954 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.248171 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.294178 4734 patch_prober.go:28] interesting pod/router-default-5444994796-2djpb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 23:22:07 crc kubenswrapper[4734]: [-]has-synced failed: reason withheld Dec 05 23:22:07 crc kubenswrapper[4734]: [+]process-running ok Dec 05 23:22:07 crc kubenswrapper[4734]: healthz check failed Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.294824 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2djpb" podUID="8251ea29-3180-4d6c-a6f7-6477bcd8ed6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.302052 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kdt5b"] Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.303484 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kdt5b" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.306012 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.316757 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kdt5b"] Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.354769 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5bcz\" (UniqueName: \"kubernetes.io/projected/5040a4a1-0b01-4581-89a7-37186c3caebe-kube-api-access-t5bcz\") pod \"redhat-marketplace-kdt5b\" (UID: \"5040a4a1-0b01-4581-89a7-37186c3caebe\") " pod="openshift-marketplace/redhat-marketplace-kdt5b" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.354840 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5040a4a1-0b01-4581-89a7-37186c3caebe-catalog-content\") pod \"redhat-marketplace-kdt5b\" (UID: \"5040a4a1-0b01-4581-89a7-37186c3caebe\") " pod="openshift-marketplace/redhat-marketplace-kdt5b" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.354868 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5040a4a1-0b01-4581-89a7-37186c3caebe-utilities\") pod \"redhat-marketplace-kdt5b\" (UID: \"5040a4a1-0b01-4581-89a7-37186c3caebe\") " pod="openshift-marketplace/redhat-marketplace-kdt5b" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.456942 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5040a4a1-0b01-4581-89a7-37186c3caebe-utilities\") pod \"redhat-marketplace-kdt5b\" (UID: 
\"5040a4a1-0b01-4581-89a7-37186c3caebe\") " pod="openshift-marketplace/redhat-marketplace-kdt5b" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.457045 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5bcz\" (UniqueName: \"kubernetes.io/projected/5040a4a1-0b01-4581-89a7-37186c3caebe-kube-api-access-t5bcz\") pod \"redhat-marketplace-kdt5b\" (UID: \"5040a4a1-0b01-4581-89a7-37186c3caebe\") " pod="openshift-marketplace/redhat-marketplace-kdt5b" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.457102 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5040a4a1-0b01-4581-89a7-37186c3caebe-catalog-content\") pod \"redhat-marketplace-kdt5b\" (UID: \"5040a4a1-0b01-4581-89a7-37186c3caebe\") " pod="openshift-marketplace/redhat-marketplace-kdt5b" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.457803 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5040a4a1-0b01-4581-89a7-37186c3caebe-catalog-content\") pod \"redhat-marketplace-kdt5b\" (UID: \"5040a4a1-0b01-4581-89a7-37186c3caebe\") " pod="openshift-marketplace/redhat-marketplace-kdt5b" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.457845 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5040a4a1-0b01-4581-89a7-37186c3caebe-utilities\") pod \"redhat-marketplace-kdt5b\" (UID: \"5040a4a1-0b01-4581-89a7-37186c3caebe\") " pod="openshift-marketplace/redhat-marketplace-kdt5b" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.487203 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5bcz\" (UniqueName: \"kubernetes.io/projected/5040a4a1-0b01-4581-89a7-37186c3caebe-kube-api-access-t5bcz\") pod \"redhat-marketplace-kdt5b\" (UID: 
\"5040a4a1-0b01-4581-89a7-37186c3caebe\") " pod="openshift-marketplace/redhat-marketplace-kdt5b" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.532086 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kdt5b" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.597945 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.598797 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.601461 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.601898 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.639860 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.643548 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.659757 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9647e702-438f-4f79-ba2a-c1adab9cdb40-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9647e702-438f-4f79-ba2a-c1adab9cdb40\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.659895 4734 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9647e702-438f-4f79-ba2a-c1adab9cdb40-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9647e702-438f-4f79-ba2a-c1adab9cdb40\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.699562 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ptxqw"] Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.709866 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k6qjj"] Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.714753 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k6qjj" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.721413 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k6qjj"] Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.761024 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f0ef57c-6339-4a66-9318-7e594c611080-utilities\") pod \"redhat-marketplace-k6qjj\" (UID: \"9f0ef57c-6339-4a66-9318-7e594c611080\") " pod="openshift-marketplace/redhat-marketplace-k6qjj" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.761095 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f0ef57c-6339-4a66-9318-7e594c611080-catalog-content\") pod \"redhat-marketplace-k6qjj\" (UID: \"9f0ef57c-6339-4a66-9318-7e594c611080\") " pod="openshift-marketplace/redhat-marketplace-k6qjj" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.761227 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9647e702-438f-4f79-ba2a-c1adab9cdb40-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9647e702-438f-4f79-ba2a-c1adab9cdb40\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.761304 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ct6q\" (UniqueName: \"kubernetes.io/projected/9f0ef57c-6339-4a66-9318-7e594c611080-kube-api-access-9ct6q\") pod \"redhat-marketplace-k6qjj\" (UID: \"9f0ef57c-6339-4a66-9318-7e594c611080\") " pod="openshift-marketplace/redhat-marketplace-k6qjj" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.761343 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9647e702-438f-4f79-ba2a-c1adab9cdb40-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9647e702-438f-4f79-ba2a-c1adab9cdb40\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.761805 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9647e702-438f-4f79-ba2a-c1adab9cdb40-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9647e702-438f-4f79-ba2a-c1adab9cdb40\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.780492 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9647e702-438f-4f79-ba2a-c1adab9cdb40-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9647e702-438f-4f79-ba2a-c1adab9cdb40\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.862640 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-9ct6q\" (UniqueName: \"kubernetes.io/projected/9f0ef57c-6339-4a66-9318-7e594c611080-kube-api-access-9ct6q\") pod \"redhat-marketplace-k6qjj\" (UID: \"9f0ef57c-6339-4a66-9318-7e594c611080\") " pod="openshift-marketplace/redhat-marketplace-k6qjj" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.862715 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f0ef57c-6339-4a66-9318-7e594c611080-utilities\") pod \"redhat-marketplace-k6qjj\" (UID: \"9f0ef57c-6339-4a66-9318-7e594c611080\") " pod="openshift-marketplace/redhat-marketplace-k6qjj" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.862733 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f0ef57c-6339-4a66-9318-7e594c611080-catalog-content\") pod \"redhat-marketplace-k6qjj\" (UID: \"9f0ef57c-6339-4a66-9318-7e594c611080\") " pod="openshift-marketplace/redhat-marketplace-k6qjj" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.863439 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f0ef57c-6339-4a66-9318-7e594c611080-catalog-content\") pod \"redhat-marketplace-k6qjj\" (UID: \"9f0ef57c-6339-4a66-9318-7e594c611080\") " pod="openshift-marketplace/redhat-marketplace-k6qjj" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.864589 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f0ef57c-6339-4a66-9318-7e594c611080-utilities\") pod \"redhat-marketplace-k6qjj\" (UID: \"9f0ef57c-6339-4a66-9318-7e594c611080\") " pod="openshift-marketplace/redhat-marketplace-k6qjj" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.887762 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ct6q\" (UniqueName: 
\"kubernetes.io/projected/9f0ef57c-6339-4a66-9318-7e594c611080-kube-api-access-9ct6q\") pod \"redhat-marketplace-k6qjj\" (UID: \"9f0ef57c-6339-4a66-9318-7e594c611080\") " pod="openshift-marketplace/redhat-marketplace-k6qjj" Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.915858 4734 generic.go:334] "Generic (PLEG): container finished" podID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" containerID="d5b8b8cb574d08e779955b6608ed0c030124c6ce2887ce6f86a9ec7d78f1b19e" exitCode=0 Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.915933 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cbr82" event={"ID":"e6e9b180-8bc8-4f84-b1f7-ec822b6f6560","Type":"ContainerDied","Data":"d5b8b8cb574d08e779955b6608ed0c030124c6ce2887ce6f86a9ec7d78f1b19e"} Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.918084 4734 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.932327 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d179d6b5105fde715ce4339ae7855d6171ab7b50206afdc95e4fd433a19e72b4"} Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.932366 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"74c4e31b312a78afb3b6b153fe36f8986df28ab73f53317cdc3be4b32e15a757"} Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.938038 4734 generic.go:334] "Generic (PLEG): container finished" podID="597348be-fe32-4495-bb10-d152ed593e3e" containerID="f4751cf451faedbcaa626ab80383373cfd0ca970206a2e2b2d17fcb0c313ea32" exitCode=0 Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.938120 4734 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cb4rj" event={"ID":"597348be-fe32-4495-bb10-d152ed593e3e","Type":"ContainerDied","Data":"f4751cf451faedbcaa626ab80383373cfd0ca970206a2e2b2d17fcb0c313ea32"} Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.949581 4734 generic.go:334] "Generic (PLEG): container finished" podID="7e24c08e-fb74-4ae6-9c48-ae9653c964e8" containerID="5abb5e424fa45208e6807d339a109cdafafa143b061bf4664d1d324e20c734a4" exitCode=0 Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.949658 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hmv47" event={"ID":"7e24c08e-fb74-4ae6-9c48-ae9653c964e8","Type":"ContainerDied","Data":"5abb5e424fa45208e6807d339a109cdafafa143b061bf4664d1d324e20c734a4"} Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.963930 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f2b2063045a755c32beb1840f32f2f77fa11c19c331fa9fc231be552f3241c52"} Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.963990 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"d87aa459b495b180fe30d2e7b8096079856ad24999747304ccde337f40c7f4ac"} Dec 05 23:22:07 crc kubenswrapper[4734]: I1205 23:22:07.976605 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.019856 4734 generic.go:334] "Generic (PLEG): container finished" podID="15bf5615-0adc-46cd-8796-d419076acac7" containerID="a7c33c398efa5f848e79e010010315d4c638989f3a5d70bbd187d83da830a16a" exitCode=0 Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.019972 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-67l49" event={"ID":"15bf5615-0adc-46cd-8796-d419076acac7","Type":"ContainerDied","Data":"a7c33c398efa5f848e79e010010315d4c638989f3a5d70bbd187d83da830a16a"} Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.020005 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-67l49" event={"ID":"15bf5615-0adc-46cd-8796-d419076acac7","Type":"ContainerStarted","Data":"b5d0cb9a1d47412c5a027563ad707bc45f330a172c608b781e43c928256c1454"} Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.023228 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kdt5b"] Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.039618 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k6qjj" Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.057871 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w5bmh" event={"ID":"c8fec47c-0cfe-45b1-9f45-5eba4c924359","Type":"ContainerStarted","Data":"7e51228f99e054c65f546e4652a236963d4f794d484334f731ef3d27e14331cc"} Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.060689 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" event={"ID":"f4f948a0-bcd5-4e9e-86ec-0429082dac44","Type":"ContainerStarted","Data":"f734123a9f2008a70e1b97f4c4481b3d5016d91517a8e6596f89a1cd984555aa"} Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.061107 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.091680 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"6a4de36f7e379d0d5a9baf7b0dd55dadf8098baffed79d14cc8789844dff3b6c"} Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.091764 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"39a40e82fa8ef5c428632d68f9ca5091389946ebfaa132d57fcaa23a7c3d9fd8"} Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.092877 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.109804 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-54qxz" Dec 05 
23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.207032 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" podStartSLOduration=128.207004645 podStartE2EDuration="2m8.207004645s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:08.206072202 +0000 UTC m=+148.889476478" watchObservedRunningTime="2025-12-05 23:22:08.207004645 +0000 UTC m=+148.890408921" Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.207441 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-w5bmh" podStartSLOduration=12.207436116 podStartE2EDuration="12.207436116s" podCreationTimestamp="2025-12-05 23:21:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:08.16042727 +0000 UTC m=+148.843831546" watchObservedRunningTime="2025-12-05 23:22:08.207436116 +0000 UTC m=+148.890840392" Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.286420 4734 patch_prober.go:28] interesting pod/router-default-5444994796-2djpb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 23:22:08 crc kubenswrapper[4734]: [-]has-synced failed: reason withheld Dec 05 23:22:08 crc kubenswrapper[4734]: [+]process-running ok Dec 05 23:22:08 crc kubenswrapper[4734]: healthz check failed Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.286955 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2djpb" podUID="8251ea29-3180-4d6c-a6f7-6477bcd8ed6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 
23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.430548 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.547246 4734 patch_prober.go:28] interesting pod/downloads-7954f5f757-htwjw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.547307 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-htwjw" podUID="d391f1fa-9bbe-478c-a1da-6ccb8f75f3c5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.547559 4734 patch_prober.go:28] interesting pod/downloads-7954f5f757-htwjw container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.547742 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-htwjw" podUID="d391f1fa-9bbe-478c-a1da-6ccb8f75f3c5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.581998 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k6qjj"] Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.582165 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-5h9wr" Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.582234 4734 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-5h9wr" Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.595692 4734 patch_prober.go:28] interesting pod/console-f9d7485db-5h9wr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.31:8443/health\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.595755 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-5h9wr" podUID="790e28b3-bfd6-40f2-8bd4-272fc91b9ffe" containerName="console" probeResult="failure" output="Get \"https://10.217.0.31:8443/health\": dial tcp 10.217.0.31:8443: connect: connection refused" Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.627589 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.627741 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.647013 4734 patch_prober.go:28] interesting pod/apiserver-76f77b778f-x67qn container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 23:22:08 crc kubenswrapper[4734]: [+]log ok Dec 05 23:22:08 crc kubenswrapper[4734]: [+]etcd ok Dec 05 23:22:08 crc kubenswrapper[4734]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 23:22:08 crc kubenswrapper[4734]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 23:22:08 crc kubenswrapper[4734]: [+]poststarthook/max-in-flight-filter ok Dec 05 23:22:08 crc kubenswrapper[4734]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 23:22:08 crc kubenswrapper[4734]: 
[+]poststarthook/image.openshift.io-apiserver-caches ok Dec 05 23:22:08 crc kubenswrapper[4734]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 05 23:22:08 crc kubenswrapper[4734]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 05 23:22:08 crc kubenswrapper[4734]: [+]poststarthook/project.openshift.io-projectcache ok Dec 05 23:22:08 crc kubenswrapper[4734]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 05 23:22:08 crc kubenswrapper[4734]: [+]poststarthook/openshift.io-startinformers ok Dec 05 23:22:08 crc kubenswrapper[4734]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 05 23:22:08 crc kubenswrapper[4734]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 05 23:22:08 crc kubenswrapper[4734]: livez check failed Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.647102 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-x67qn" podUID="7adf273b-63fb-40fe-9d0d-fe467260565b" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.701221 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vj45t"] Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.703186 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vj45t" Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.707978 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.756567 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vj45t"] Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.908786 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ba0c803-1b80-4161-afa1-c9b6dc65ea00-catalog-content\") pod \"redhat-operators-vj45t\" (UID: \"7ba0c803-1b80-4161-afa1-c9b6dc65ea00\") " pod="openshift-marketplace/redhat-operators-vj45t" Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.908874 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ba0c803-1b80-4161-afa1-c9b6dc65ea00-utilities\") pod \"redhat-operators-vj45t\" (UID: \"7ba0c803-1b80-4161-afa1-c9b6dc65ea00\") " pod="openshift-marketplace/redhat-operators-vj45t" Dec 05 23:22:08 crc kubenswrapper[4734]: I1205 23:22:08.908953 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgxsd\" (UniqueName: \"kubernetes.io/projected/7ba0c803-1b80-4161-afa1-c9b6dc65ea00-kube-api-access-tgxsd\") pod \"redhat-operators-vj45t\" (UID: \"7ba0c803-1b80-4161-afa1-c9b6dc65ea00\") " pod="openshift-marketplace/redhat-operators-vj45t" Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.010894 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgxsd\" (UniqueName: \"kubernetes.io/projected/7ba0c803-1b80-4161-afa1-c9b6dc65ea00-kube-api-access-tgxsd\") pod \"redhat-operators-vj45t\" (UID: 
\"7ba0c803-1b80-4161-afa1-c9b6dc65ea00\") " pod="openshift-marketplace/redhat-operators-vj45t" Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.011131 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ba0c803-1b80-4161-afa1-c9b6dc65ea00-catalog-content\") pod \"redhat-operators-vj45t\" (UID: \"7ba0c803-1b80-4161-afa1-c9b6dc65ea00\") " pod="openshift-marketplace/redhat-operators-vj45t" Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.011200 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ba0c803-1b80-4161-afa1-c9b6dc65ea00-utilities\") pod \"redhat-operators-vj45t\" (UID: \"7ba0c803-1b80-4161-afa1-c9b6dc65ea00\") " pod="openshift-marketplace/redhat-operators-vj45t" Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.012225 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ba0c803-1b80-4161-afa1-c9b6dc65ea00-catalog-content\") pod \"redhat-operators-vj45t\" (UID: \"7ba0c803-1b80-4161-afa1-c9b6dc65ea00\") " pod="openshift-marketplace/redhat-operators-vj45t" Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.012645 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ba0c803-1b80-4161-afa1-c9b6dc65ea00-utilities\") pod \"redhat-operators-vj45t\" (UID: \"7ba0c803-1b80-4161-afa1-c9b6dc65ea00\") " pod="openshift-marketplace/redhat-operators-vj45t" Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.033393 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgxsd\" (UniqueName: \"kubernetes.io/projected/7ba0c803-1b80-4161-afa1-c9b6dc65ea00-kube-api-access-tgxsd\") pod \"redhat-operators-vj45t\" (UID: \"7ba0c803-1b80-4161-afa1-c9b6dc65ea00\") " 
pod="openshift-marketplace/redhat-operators-vj45t" Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.040980 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vj45t" Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.099882 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gnbkp"] Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.101271 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gnbkp" Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.102576 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gnbkp"] Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.113792 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4-catalog-content\") pod \"redhat-operators-gnbkp\" (UID: \"4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4\") " pod="openshift-marketplace/redhat-operators-gnbkp" Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.113873 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf8jt\" (UniqueName: \"kubernetes.io/projected/4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4-kube-api-access-bf8jt\") pod \"redhat-operators-gnbkp\" (UID: \"4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4\") " pod="openshift-marketplace/redhat-operators-gnbkp" Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.114178 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4-utilities\") pod \"redhat-operators-gnbkp\" (UID: \"4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4\") " pod="openshift-marketplace/redhat-operators-gnbkp" 
Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.123846 4734 generic.go:334] "Generic (PLEG): container finished" podID="5040a4a1-0b01-4581-89a7-37186c3caebe" containerID="126cf9aed4db121243f46854a2e07e9b4913773b2b0e42dc3d6ffd27cb4c3229" exitCode=0 Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.123978 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kdt5b" event={"ID":"5040a4a1-0b01-4581-89a7-37186c3caebe","Type":"ContainerDied","Data":"126cf9aed4db121243f46854a2e07e9b4913773b2b0e42dc3d6ffd27cb4c3229"} Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.124012 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kdt5b" event={"ID":"5040a4a1-0b01-4581-89a7-37186c3caebe","Type":"ContainerStarted","Data":"7c42ae00e0855109341d487c785732b1fc1a459cf8f62942fbbc49bb3cfc2fc2"} Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.128642 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" event={"ID":"f4f948a0-bcd5-4e9e-86ec-0429082dac44","Type":"ContainerStarted","Data":"e38a8d45db59594dbd6149cd98567240633a7e01642234d90197625ff6c83768"} Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.143452 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9647e702-438f-4f79-ba2a-c1adab9cdb40","Type":"ContainerStarted","Data":"7489a38b609784b7866847b2c13bea03edc6ee0db0c7b3caa28023beb1a70011"} Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.143512 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9647e702-438f-4f79-ba2a-c1adab9cdb40","Type":"ContainerStarted","Data":"925f8efc96f1e0f1911910b6b85ce1fde2fb636041ef68a84c5b1b7302bfdff7"} Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.153017 4734 generic.go:334] "Generic (PLEG): 
container finished" podID="2a20dbad-8352-4804-9c0e-a2b6108a0d1b" containerID="c2ab13668511b3efa65133e7ec2f85f1d91583ee811fb50b4e0a228eac2de9b8" exitCode=0 Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.153271 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416275-mfkvp" event={"ID":"2a20dbad-8352-4804-9c0e-a2b6108a0d1b","Type":"ContainerDied","Data":"c2ab13668511b3efa65133e7ec2f85f1d91583ee811fb50b4e0a228eac2de9b8"} Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.190418 4734 generic.go:334] "Generic (PLEG): container finished" podID="9f0ef57c-6339-4a66-9318-7e594c611080" containerID="87a98d7ad0d34bc09562daa39f25e6f8874358b78261fea21c41bfa98c2ea386" exitCode=0 Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.190761 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k6qjj" event={"ID":"9f0ef57c-6339-4a66-9318-7e594c611080","Type":"ContainerDied","Data":"87a98d7ad0d34bc09562daa39f25e6f8874358b78261fea21c41bfa98c2ea386"} Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.204288 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k6qjj" event={"ID":"9f0ef57c-6339-4a66-9318-7e594c611080","Type":"ContainerStarted","Data":"2ad408f4b63806e3458fc92a15f61be5aa1cad05e76069171e0bcd192735ca15"} Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.218931 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.218903438 podStartE2EDuration="2.218903438s" podCreationTimestamp="2025-12-05 23:22:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:09.200967108 +0000 UTC m=+149.884371404" watchObservedRunningTime="2025-12-05 23:22:09.218903438 +0000 UTC m=+149.902307714" Dec 05 23:22:09 
crc kubenswrapper[4734]: I1205 23:22:09.219700 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf8jt\" (UniqueName: \"kubernetes.io/projected/4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4-kube-api-access-bf8jt\") pod \"redhat-operators-gnbkp\" (UID: \"4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4\") " pod="openshift-marketplace/redhat-operators-gnbkp" Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.220246 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4-utilities\") pod \"redhat-operators-gnbkp\" (UID: \"4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4\") " pod="openshift-marketplace/redhat-operators-gnbkp" Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.220298 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4-catalog-content\") pod \"redhat-operators-gnbkp\" (UID: \"4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4\") " pod="openshift-marketplace/redhat-operators-gnbkp" Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.221557 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4-utilities\") pod \"redhat-operators-gnbkp\" (UID: \"4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4\") " pod="openshift-marketplace/redhat-operators-gnbkp" Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.228182 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4-catalog-content\") pod \"redhat-operators-gnbkp\" (UID: \"4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4\") " pod="openshift-marketplace/redhat-operators-gnbkp" Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.248576 4734 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf8jt\" (UniqueName: \"kubernetes.io/projected/4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4-kube-api-access-bf8jt\") pod \"redhat-operators-gnbkp\" (UID: \"4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4\") " pod="openshift-marketplace/redhat-operators-gnbkp" Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.273414 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-2djpb" Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.284328 4734 patch_prober.go:28] interesting pod/router-default-5444994796-2djpb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 23:22:09 crc kubenswrapper[4734]: [-]has-synced failed: reason withheld Dec 05 23:22:09 crc kubenswrapper[4734]: [+]process-running ok Dec 05 23:22:09 crc kubenswrapper[4734]: healthz check failed Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.284406 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2djpb" podUID="8251ea29-3180-4d6c-a6f7-6477bcd8ed6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.422901 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gnbkp" Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.452089 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vj45t"] Dec 05 23:22:09 crc kubenswrapper[4734]: I1205 23:22:09.949206 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gnbkp"] Dec 05 23:22:09 crc kubenswrapper[4734]: W1205 23:22:09.963539 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b7ae74a_0552_4e8b_9ce4_b8b9e5f389b4.slice/crio-f5e26c5781dac26b5b7baf898b86fe72efd6beed46b91c0f5a739d90e755820d WatchSource:0}: Error finding container f5e26c5781dac26b5b7baf898b86fe72efd6beed46b91c0f5a739d90e755820d: Status 404 returned error can't find the container with id f5e26c5781dac26b5b7baf898b86fe72efd6beed46b91c0f5a739d90e755820d Dec 05 23:22:10 crc kubenswrapper[4734]: I1205 23:22:10.230628 4734 generic.go:334] "Generic (PLEG): container finished" podID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" containerID="29e391a7080a1b0cf8c0ed2623b9ce8b1b62511013733cee8112ccf6ede9e797" exitCode=0 Dec 05 23:22:10 crc kubenswrapper[4734]: I1205 23:22:10.230718 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vj45t" event={"ID":"7ba0c803-1b80-4161-afa1-c9b6dc65ea00","Type":"ContainerDied","Data":"29e391a7080a1b0cf8c0ed2623b9ce8b1b62511013733cee8112ccf6ede9e797"} Dec 05 23:22:10 crc kubenswrapper[4734]: I1205 23:22:10.230749 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vj45t" event={"ID":"7ba0c803-1b80-4161-afa1-c9b6dc65ea00","Type":"ContainerStarted","Data":"2eceea83ef4bedf76d62f59e6c411372ef92f8088b2d055aaa79007d15228e12"} Dec 05 23:22:10 crc kubenswrapper[4734]: I1205 23:22:10.250097 4734 generic.go:334] "Generic (PLEG): container finished" 
podID="9647e702-438f-4f79-ba2a-c1adab9cdb40" containerID="7489a38b609784b7866847b2c13bea03edc6ee0db0c7b3caa28023beb1a70011" exitCode=0 Dec 05 23:22:10 crc kubenswrapper[4734]: I1205 23:22:10.250191 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9647e702-438f-4f79-ba2a-c1adab9cdb40","Type":"ContainerDied","Data":"7489a38b609784b7866847b2c13bea03edc6ee0db0c7b3caa28023beb1a70011"} Dec 05 23:22:10 crc kubenswrapper[4734]: I1205 23:22:10.261684 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gnbkp" event={"ID":"4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4","Type":"ContainerStarted","Data":"f5e26c5781dac26b5b7baf898b86fe72efd6beed46b91c0f5a739d90e755820d"} Dec 05 23:22:10 crc kubenswrapper[4734]: I1205 23:22:10.275788 4734 patch_prober.go:28] interesting pod/router-default-5444994796-2djpb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 23:22:10 crc kubenswrapper[4734]: [-]has-synced failed: reason withheld Dec 05 23:22:10 crc kubenswrapper[4734]: [+]process-running ok Dec 05 23:22:10 crc kubenswrapper[4734]: healthz check failed Dec 05 23:22:10 crc kubenswrapper[4734]: I1205 23:22:10.275894 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2djpb" podUID="8251ea29-3180-4d6c-a6f7-6477bcd8ed6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 23:22:10 crc kubenswrapper[4734]: I1205 23:22:10.583479 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416275-mfkvp" Dec 05 23:22:10 crc kubenswrapper[4734]: I1205 23:22:10.773957 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a20dbad-8352-4804-9c0e-a2b6108a0d1b-config-volume\") pod \"2a20dbad-8352-4804-9c0e-a2b6108a0d1b\" (UID: \"2a20dbad-8352-4804-9c0e-a2b6108a0d1b\") " Dec 05 23:22:10 crc kubenswrapper[4734]: I1205 23:22:10.774186 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a20dbad-8352-4804-9c0e-a2b6108a0d1b-secret-volume\") pod \"2a20dbad-8352-4804-9c0e-a2b6108a0d1b\" (UID: \"2a20dbad-8352-4804-9c0e-a2b6108a0d1b\") " Dec 05 23:22:10 crc kubenswrapper[4734]: I1205 23:22:10.774303 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd8ws\" (UniqueName: \"kubernetes.io/projected/2a20dbad-8352-4804-9c0e-a2b6108a0d1b-kube-api-access-gd8ws\") pod \"2a20dbad-8352-4804-9c0e-a2b6108a0d1b\" (UID: \"2a20dbad-8352-4804-9c0e-a2b6108a0d1b\") " Dec 05 23:22:10 crc kubenswrapper[4734]: I1205 23:22:10.774862 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a20dbad-8352-4804-9c0e-a2b6108a0d1b-config-volume" (OuterVolumeSpecName: "config-volume") pod "2a20dbad-8352-4804-9c0e-a2b6108a0d1b" (UID: "2a20dbad-8352-4804-9c0e-a2b6108a0d1b"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:22:10 crc kubenswrapper[4734]: I1205 23:22:10.775313 4734 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a20dbad-8352-4804-9c0e-a2b6108a0d1b-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 23:22:10 crc kubenswrapper[4734]: I1205 23:22:10.783678 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a20dbad-8352-4804-9c0e-a2b6108a0d1b-kube-api-access-gd8ws" (OuterVolumeSpecName: "kube-api-access-gd8ws") pod "2a20dbad-8352-4804-9c0e-a2b6108a0d1b" (UID: "2a20dbad-8352-4804-9c0e-a2b6108a0d1b"). InnerVolumeSpecName "kube-api-access-gd8ws". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:22:10 crc kubenswrapper[4734]: I1205 23:22:10.807485 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a20dbad-8352-4804-9c0e-a2b6108a0d1b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2a20dbad-8352-4804-9c0e-a2b6108a0d1b" (UID: "2a20dbad-8352-4804-9c0e-a2b6108a0d1b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:22:10 crc kubenswrapper[4734]: I1205 23:22:10.876448 4734 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a20dbad-8352-4804-9c0e-a2b6108a0d1b-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 23:22:10 crc kubenswrapper[4734]: I1205 23:22:10.876491 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gd8ws\" (UniqueName: \"kubernetes.io/projected/2a20dbad-8352-4804-9c0e-a2b6108a0d1b-kube-api-access-gd8ws\") on node \"crc\" DevicePath \"\"" Dec 05 23:22:11 crc kubenswrapper[4734]: I1205 23:22:11.272669 4734 patch_prober.go:28] interesting pod/router-default-5444994796-2djpb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 23:22:11 crc kubenswrapper[4734]: [-]has-synced failed: reason withheld Dec 05 23:22:11 crc kubenswrapper[4734]: [+]process-running ok Dec 05 23:22:11 crc kubenswrapper[4734]: healthz check failed Dec 05 23:22:11 crc kubenswrapper[4734]: I1205 23:22:11.272734 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2djpb" podUID="8251ea29-3180-4d6c-a6f7-6477bcd8ed6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 23:22:11 crc kubenswrapper[4734]: I1205 23:22:11.287752 4734 generic.go:334] "Generic (PLEG): container finished" podID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" containerID="7296522c020efa2b5dc85ed3aae2059722a3c0d9f8bb93314cacd8bb82249cdb" exitCode=0 Dec 05 23:22:11 crc kubenswrapper[4734]: I1205 23:22:11.287835 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gnbkp" 
event={"ID":"4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4","Type":"ContainerDied","Data":"7296522c020efa2b5dc85ed3aae2059722a3c0d9f8bb93314cacd8bb82249cdb"} Dec 05 23:22:11 crc kubenswrapper[4734]: I1205 23:22:11.291701 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416275-mfkvp" event={"ID":"2a20dbad-8352-4804-9c0e-a2b6108a0d1b","Type":"ContainerDied","Data":"14f868430dc14e33bceff0e87a41131a7790986ff959ff99aef89a3e5c3f1e73"} Dec 05 23:22:11 crc kubenswrapper[4734]: I1205 23:22:11.291740 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14f868430dc14e33bceff0e87a41131a7790986ff959ff99aef89a3e5c3f1e73" Dec 05 23:22:11 crc kubenswrapper[4734]: I1205 23:22:11.291779 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416275-mfkvp" Dec 05 23:22:11 crc kubenswrapper[4734]: I1205 23:22:11.552871 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 05 23:22:11 crc kubenswrapper[4734]: E1205 23:22:11.553124 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a20dbad-8352-4804-9c0e-a2b6108a0d1b" containerName="collect-profiles" Dec 05 23:22:11 crc kubenswrapper[4734]: I1205 23:22:11.553138 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a20dbad-8352-4804-9c0e-a2b6108a0d1b" containerName="collect-profiles" Dec 05 23:22:11 crc kubenswrapper[4734]: I1205 23:22:11.553252 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a20dbad-8352-4804-9c0e-a2b6108a0d1b" containerName="collect-profiles" Dec 05 23:22:11 crc kubenswrapper[4734]: I1205 23:22:11.553690 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 23:22:11 crc kubenswrapper[4734]: I1205 23:22:11.564767 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 05 23:22:11 crc kubenswrapper[4734]: I1205 23:22:11.565136 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 05 23:22:11 crc kubenswrapper[4734]: I1205 23:22:11.565394 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 05 23:22:11 crc kubenswrapper[4734]: I1205 23:22:11.704031 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/19297f80-7c38-47dc-8ebd-c375cbae766d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"19297f80-7c38-47dc-8ebd-c375cbae766d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 23:22:11 crc kubenswrapper[4734]: I1205 23:22:11.704584 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/19297f80-7c38-47dc-8ebd-c375cbae766d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"19297f80-7c38-47dc-8ebd-c375cbae766d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 23:22:11 crc kubenswrapper[4734]: I1205 23:22:11.781413 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 23:22:11 crc kubenswrapper[4734]: I1205 23:22:11.806199 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/19297f80-7c38-47dc-8ebd-c375cbae766d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"19297f80-7c38-47dc-8ebd-c375cbae766d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 23:22:11 crc kubenswrapper[4734]: I1205 23:22:11.806286 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/19297f80-7c38-47dc-8ebd-c375cbae766d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"19297f80-7c38-47dc-8ebd-c375cbae766d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 23:22:11 crc kubenswrapper[4734]: I1205 23:22:11.806419 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/19297f80-7c38-47dc-8ebd-c375cbae766d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"19297f80-7c38-47dc-8ebd-c375cbae766d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 23:22:11 crc kubenswrapper[4734]: I1205 23:22:11.826260 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/19297f80-7c38-47dc-8ebd-c375cbae766d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"19297f80-7c38-47dc-8ebd-c375cbae766d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 23:22:11 crc kubenswrapper[4734]: I1205 23:22:11.888428 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 23:22:11 crc kubenswrapper[4734]: I1205 23:22:11.907711 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9647e702-438f-4f79-ba2a-c1adab9cdb40-kube-api-access\") pod \"9647e702-438f-4f79-ba2a-c1adab9cdb40\" (UID: \"9647e702-438f-4f79-ba2a-c1adab9cdb40\") " Dec 05 23:22:11 crc kubenswrapper[4734]: I1205 23:22:11.907824 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9647e702-438f-4f79-ba2a-c1adab9cdb40-kubelet-dir\") pod \"9647e702-438f-4f79-ba2a-c1adab9cdb40\" (UID: \"9647e702-438f-4f79-ba2a-c1adab9cdb40\") " Dec 05 23:22:11 crc kubenswrapper[4734]: I1205 23:22:11.908088 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9647e702-438f-4f79-ba2a-c1adab9cdb40-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9647e702-438f-4f79-ba2a-c1adab9cdb40" (UID: "9647e702-438f-4f79-ba2a-c1adab9cdb40"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:22:11 crc kubenswrapper[4734]: I1205 23:22:11.911030 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9647e702-438f-4f79-ba2a-c1adab9cdb40-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9647e702-438f-4f79-ba2a-c1adab9cdb40" (UID: "9647e702-438f-4f79-ba2a-c1adab9cdb40"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:22:12 crc kubenswrapper[4734]: I1205 23:22:12.009054 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9647e702-438f-4f79-ba2a-c1adab9cdb40-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 23:22:12 crc kubenswrapper[4734]: I1205 23:22:12.009137 4734 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9647e702-438f-4f79-ba2a-c1adab9cdb40-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 23:22:12 crc kubenswrapper[4734]: I1205 23:22:12.273743 4734 patch_prober.go:28] interesting pod/router-default-5444994796-2djpb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 23:22:12 crc kubenswrapper[4734]: [-]has-synced failed: reason withheld Dec 05 23:22:12 crc kubenswrapper[4734]: [+]process-running ok Dec 05 23:22:12 crc kubenswrapper[4734]: healthz check failed Dec 05 23:22:12 crc kubenswrapper[4734]: I1205 23:22:12.274122 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2djpb" podUID="8251ea29-3180-4d6c-a6f7-6477bcd8ed6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 23:22:12 crc kubenswrapper[4734]: I1205 23:22:12.322570 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9647e702-438f-4f79-ba2a-c1adab9cdb40","Type":"ContainerDied","Data":"925f8efc96f1e0f1911910b6b85ce1fde2fb636041ef68a84c5b1b7302bfdff7"} Dec 05 23:22:12 crc kubenswrapper[4734]: I1205 23:22:12.322631 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="925f8efc96f1e0f1911910b6b85ce1fde2fb636041ef68a84c5b1b7302bfdff7" Dec 05 23:22:12 crc 
kubenswrapper[4734]: I1205 23:22:12.322627 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 23:22:12 crc kubenswrapper[4734]: I1205 23:22:12.448835 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 05 23:22:12 crc kubenswrapper[4734]: W1205 23:22:12.480587 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod19297f80_7c38_47dc_8ebd_c375cbae766d.slice/crio-ff130225987298fa11d61a0e9b6c169664fae45a6499121b8492eff97db14f5d WatchSource:0}: Error finding container ff130225987298fa11d61a0e9b6c169664fae45a6499121b8492eff97db14f5d: Status 404 returned error can't find the container with id ff130225987298fa11d61a0e9b6c169664fae45a6499121b8492eff97db14f5d Dec 05 23:22:13 crc kubenswrapper[4734]: I1205 23:22:13.275191 4734 patch_prober.go:28] interesting pod/router-default-5444994796-2djpb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 23:22:13 crc kubenswrapper[4734]: [-]has-synced failed: reason withheld Dec 05 23:22:13 crc kubenswrapper[4734]: [+]process-running ok Dec 05 23:22:13 crc kubenswrapper[4734]: healthz check failed Dec 05 23:22:13 crc kubenswrapper[4734]: I1205 23:22:13.275659 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2djpb" podUID="8251ea29-3180-4d6c-a6f7-6477bcd8ed6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 23:22:13 crc kubenswrapper[4734]: I1205 23:22:13.338988 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"19297f80-7c38-47dc-8ebd-c375cbae766d","Type":"ContainerStarted","Data":"ff130225987298fa11d61a0e9b6c169664fae45a6499121b8492eff97db14f5d"} Dec 05 23:22:13 crc kubenswrapper[4734]: I1205 23:22:13.633216 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:22:13 crc kubenswrapper[4734]: I1205 23:22:13.638048 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-x67qn" Dec 05 23:22:14 crc kubenswrapper[4734]: I1205 23:22:14.276262 4734 patch_prober.go:28] interesting pod/router-default-5444994796-2djpb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 23:22:14 crc kubenswrapper[4734]: [-]has-synced failed: reason withheld Dec 05 23:22:14 crc kubenswrapper[4734]: [+]process-running ok Dec 05 23:22:14 crc kubenswrapper[4734]: healthz check failed Dec 05 23:22:14 crc kubenswrapper[4734]: I1205 23:22:14.276359 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2djpb" podUID="8251ea29-3180-4d6c-a6f7-6477bcd8ed6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 23:22:14 crc kubenswrapper[4734]: I1205 23:22:14.362596 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"19297f80-7c38-47dc-8ebd-c375cbae766d","Type":"ContainerStarted","Data":"d20213148e1c834e27ade966c7e4cee6b851cee5b67226a4753f2cd8d646c7f7"} Dec 05 23:22:14 crc kubenswrapper[4734]: I1205 23:22:14.391509 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.391433211 podStartE2EDuration="3.391433211s" podCreationTimestamp="2025-12-05 23:22:11 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:14.389328478 +0000 UTC m=+155.072732754" watchObservedRunningTime="2025-12-05 23:22:14.391433211 +0000 UTC m=+155.074837487" Dec 05 23:22:14 crc kubenswrapper[4734]: I1205 23:22:14.407049 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4htrx" Dec 05 23:22:15 crc kubenswrapper[4734]: I1205 23:22:15.273722 4734 patch_prober.go:28] interesting pod/router-default-5444994796-2djpb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 23:22:15 crc kubenswrapper[4734]: [-]has-synced failed: reason withheld Dec 05 23:22:15 crc kubenswrapper[4734]: [+]process-running ok Dec 05 23:22:15 crc kubenswrapper[4734]: healthz check failed Dec 05 23:22:15 crc kubenswrapper[4734]: I1205 23:22:15.274224 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2djpb" podUID="8251ea29-3180-4d6c-a6f7-6477bcd8ed6f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 23:22:15 crc kubenswrapper[4734]: I1205 23:22:15.404831 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"19297f80-7c38-47dc-8ebd-c375cbae766d","Type":"ContainerDied","Data":"d20213148e1c834e27ade966c7e4cee6b851cee5b67226a4753f2cd8d646c7f7"} Dec 05 23:22:15 crc kubenswrapper[4734]: I1205 23:22:15.407311 4734 generic.go:334] "Generic (PLEG): container finished" podID="19297f80-7c38-47dc-8ebd-c375cbae766d" containerID="d20213148e1c834e27ade966c7e4cee6b851cee5b67226a4753f2cd8d646c7f7" exitCode=0 Dec 05 23:22:16 crc kubenswrapper[4734]: I1205 23:22:16.457509 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-ingress/router-default-5444994796-2djpb" Dec 05 23:22:16 crc kubenswrapper[4734]: I1205 23:22:16.461235 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-2djpb" Dec 05 23:22:18 crc kubenswrapper[4734]: I1205 23:22:18.546869 4734 patch_prober.go:28] interesting pod/downloads-7954f5f757-htwjw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 05 23:22:18 crc kubenswrapper[4734]: I1205 23:22:18.547350 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-htwjw" podUID="d391f1fa-9bbe-478c-a1da-6ccb8f75f3c5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 05 23:22:18 crc kubenswrapper[4734]: I1205 23:22:18.546880 4734 patch_prober.go:28] interesting pod/downloads-7954f5f757-htwjw container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 05 23:22:18 crc kubenswrapper[4734]: I1205 23:22:18.547828 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-htwjw" podUID="d391f1fa-9bbe-478c-a1da-6ccb8f75f3c5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 05 23:22:18 crc kubenswrapper[4734]: I1205 23:22:18.586653 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-5h9wr" Dec 05 23:22:18 crc kubenswrapper[4734]: I1205 23:22:18.590143 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console/console-f9d7485db-5h9wr" Dec 05 23:22:20 crc kubenswrapper[4734]: I1205 23:22:20.446540 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:22:20 crc kubenswrapper[4734]: I1205 23:22:20.446967 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:22:22 crc kubenswrapper[4734]: I1205 23:22:22.860544 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/641af4fe-dd54-4118-8985-d37a03d64f79-metrics-certs\") pod \"network-metrics-daemon-l6r6g\" (UID: \"641af4fe-dd54-4118-8985-d37a03d64f79\") " pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:22:22 crc kubenswrapper[4734]: I1205 23:22:22.876661 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/641af4fe-dd54-4118-8985-d37a03d64f79-metrics-certs\") pod \"network-metrics-daemon-l6r6g\" (UID: \"641af4fe-dd54-4118-8985-d37a03d64f79\") " pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:22:23 crc kubenswrapper[4734]: I1205 23:22:23.122910 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6r6g" Dec 05 23:22:23 crc kubenswrapper[4734]: I1205 23:22:23.756062 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 23:22:23 crc kubenswrapper[4734]: I1205 23:22:23.880500 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/19297f80-7c38-47dc-8ebd-c375cbae766d-kubelet-dir\") pod \"19297f80-7c38-47dc-8ebd-c375cbae766d\" (UID: \"19297f80-7c38-47dc-8ebd-c375cbae766d\") " Dec 05 23:22:23 crc kubenswrapper[4734]: I1205 23:22:23.880668 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/19297f80-7c38-47dc-8ebd-c375cbae766d-kube-api-access\") pod \"19297f80-7c38-47dc-8ebd-c375cbae766d\" (UID: \"19297f80-7c38-47dc-8ebd-c375cbae766d\") " Dec 05 23:22:23 crc kubenswrapper[4734]: I1205 23:22:23.880713 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19297f80-7c38-47dc-8ebd-c375cbae766d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "19297f80-7c38-47dc-8ebd-c375cbae766d" (UID: "19297f80-7c38-47dc-8ebd-c375cbae766d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:22:23 crc kubenswrapper[4734]: I1205 23:22:23.880947 4734 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/19297f80-7c38-47dc-8ebd-c375cbae766d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 23:22:23 crc kubenswrapper[4734]: I1205 23:22:23.904577 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19297f80-7c38-47dc-8ebd-c375cbae766d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "19297f80-7c38-47dc-8ebd-c375cbae766d" (UID: "19297f80-7c38-47dc-8ebd-c375cbae766d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:22:23 crc kubenswrapper[4734]: I1205 23:22:23.981924 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/19297f80-7c38-47dc-8ebd-c375cbae766d-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 23:22:24 crc kubenswrapper[4734]: I1205 23:22:24.491909 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"19297f80-7c38-47dc-8ebd-c375cbae766d","Type":"ContainerDied","Data":"ff130225987298fa11d61a0e9b6c169664fae45a6499121b8492eff97db14f5d"} Dec 05 23:22:24 crc kubenswrapper[4734]: I1205 23:22:24.491973 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff130225987298fa11d61a0e9b6c169664fae45a6499121b8492eff97db14f5d" Dec 05 23:22:24 crc kubenswrapper[4734]: I1205 23:22:24.492050 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 23:22:27 crc kubenswrapper[4734]: I1205 23:22:27.248352 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:22:28 crc kubenswrapper[4734]: I1205 23:22:28.560045 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-htwjw" Dec 05 23:22:39 crc kubenswrapper[4734]: E1205 23:22:39.193362 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 05 23:22:39 crc kubenswrapper[4734]: E1205 23:22:39.194103 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9ct6q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-k6qjj_openshift-marketplace(9f0ef57c-6339-4a66-9318-7e594c611080): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 23:22:39 crc kubenswrapper[4734]: E1205 23:22:39.195320 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code 
= Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-k6qjj" podUID="9f0ef57c-6339-4a66-9318-7e594c611080" Dec 05 23:22:39 crc kubenswrapper[4734]: I1205 23:22:39.364972 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tsb97" Dec 05 23:22:44 crc kubenswrapper[4734]: E1205 23:22:44.061399 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-k6qjj" podUID="9f0ef57c-6339-4a66-9318-7e594c611080" Dec 05 23:22:44 crc kubenswrapper[4734]: E1205 23:22:44.940105 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 05 23:22:44 crc kubenswrapper[4734]: E1205 23:22:44.940733 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8bsvp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-cb4rj_openshift-marketplace(597348be-fe32-4495-bb10-d152ed593e3e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 23:22:44 crc kubenswrapper[4734]: E1205 23:22:44.943376 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-cb4rj" podUID="597348be-fe32-4495-bb10-d152ed593e3e" Dec 05 23:22:44 crc 
kubenswrapper[4734]: E1205 23:22:44.963237 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 05 23:22:44 crc kubenswrapper[4734]: E1205 23:22:44.963482 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f7hrp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-67l49_openshift-marketplace(15bf5615-0adc-46cd-8796-d419076acac7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 23:22:44 crc kubenswrapper[4734]: E1205 23:22:44.964673 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-67l49" podUID="15bf5615-0adc-46cd-8796-d419076acac7" Dec 05 23:22:45 crc kubenswrapper[4734]: I1205 23:22:45.351895 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 05 23:22:45 crc kubenswrapper[4734]: E1205 23:22:45.352264 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9647e702-438f-4f79-ba2a-c1adab9cdb40" containerName="pruner" Dec 05 23:22:45 crc kubenswrapper[4734]: I1205 23:22:45.352283 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="9647e702-438f-4f79-ba2a-c1adab9cdb40" containerName="pruner" Dec 05 23:22:45 crc kubenswrapper[4734]: E1205 23:22:45.352297 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19297f80-7c38-47dc-8ebd-c375cbae766d" containerName="pruner" Dec 05 23:22:45 crc kubenswrapper[4734]: I1205 23:22:45.352305 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="19297f80-7c38-47dc-8ebd-c375cbae766d" containerName="pruner" Dec 05 23:22:45 crc kubenswrapper[4734]: I1205 23:22:45.352452 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="19297f80-7c38-47dc-8ebd-c375cbae766d" containerName="pruner" Dec 05 23:22:45 crc kubenswrapper[4734]: I1205 23:22:45.352490 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="9647e702-438f-4f79-ba2a-c1adab9cdb40" containerName="pruner" Dec 05 23:22:45 crc kubenswrapper[4734]: 
I1205 23:22:45.353251 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 23:22:45 crc kubenswrapper[4734]: I1205 23:22:45.361345 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 05 23:22:45 crc kubenswrapper[4734]: I1205 23:22:45.363574 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 05 23:22:45 crc kubenswrapper[4734]: I1205 23:22:45.366757 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 05 23:22:45 crc kubenswrapper[4734]: I1205 23:22:45.423335 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ddfefb46-d158-4b10-8634-1100abebd2d6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ddfefb46-d158-4b10-8634-1100abebd2d6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 23:22:45 crc kubenswrapper[4734]: I1205 23:22:45.423417 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ddfefb46-d158-4b10-8634-1100abebd2d6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ddfefb46-d158-4b10-8634-1100abebd2d6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 23:22:45 crc kubenswrapper[4734]: I1205 23:22:45.524565 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ddfefb46-d158-4b10-8634-1100abebd2d6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ddfefb46-d158-4b10-8634-1100abebd2d6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 23:22:45 crc kubenswrapper[4734]: I1205 23:22:45.524669 4734 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ddfefb46-d158-4b10-8634-1100abebd2d6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ddfefb46-d158-4b10-8634-1100abebd2d6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 23:22:45 crc kubenswrapper[4734]: I1205 23:22:45.524794 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ddfefb46-d158-4b10-8634-1100abebd2d6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ddfefb46-d158-4b10-8634-1100abebd2d6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 23:22:45 crc kubenswrapper[4734]: I1205 23:22:45.561964 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ddfefb46-d158-4b10-8634-1100abebd2d6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ddfefb46-d158-4b10-8634-1100abebd2d6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 23:22:45 crc kubenswrapper[4734]: I1205 23:22:45.689995 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 23:22:46 crc kubenswrapper[4734]: I1205 23:22:46.347502 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 23:22:48 crc kubenswrapper[4734]: E1205 23:22:48.607851 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-67l49" podUID="15bf5615-0adc-46cd-8796-d419076acac7" Dec 05 23:22:48 crc kubenswrapper[4734]: E1205 23:22:48.607916 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-cb4rj" podUID="597348be-fe32-4495-bb10-d152ed593e3e" Dec 05 23:22:48 crc kubenswrapper[4734]: E1205 23:22:48.680489 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 05 23:22:48 crc kubenswrapper[4734]: E1205 23:22:48.680754 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bf8jt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-gnbkp_openshift-marketplace(4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 23:22:48 crc kubenswrapper[4734]: E1205 23:22:48.682138 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-gnbkp" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" Dec 05 23:22:50 crc 
kubenswrapper[4734]: I1205 23:22:50.445007 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:22:50 crc kubenswrapper[4734]: I1205 23:22:50.445491 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:22:50 crc kubenswrapper[4734]: I1205 23:22:50.557481 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 23:22:50 crc kubenswrapper[4734]: I1205 23:22:50.559067 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 23:22:50 crc kubenswrapper[4734]: I1205 23:22:50.567096 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 23:22:50 crc kubenswrapper[4734]: I1205 23:22:50.604304 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/82e6bed1-c7a7-4b50-af1e-68415379c41e-kube-api-access\") pod \"installer-9-crc\" (UID: \"82e6bed1-c7a7-4b50-af1e-68415379c41e\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 23:22:50 crc kubenswrapper[4734]: I1205 23:22:50.604643 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/82e6bed1-c7a7-4b50-af1e-68415379c41e-var-lock\") pod \"installer-9-crc\" (UID: \"82e6bed1-c7a7-4b50-af1e-68415379c41e\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 23:22:50 crc kubenswrapper[4734]: I1205 23:22:50.604753 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/82e6bed1-c7a7-4b50-af1e-68415379c41e-kubelet-dir\") pod \"installer-9-crc\" (UID: \"82e6bed1-c7a7-4b50-af1e-68415379c41e\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 23:22:50 crc kubenswrapper[4734]: E1205 23:22:50.622074 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-gnbkp" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" Dec 05 23:22:50 crc kubenswrapper[4734]: I1205 23:22:50.705797 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/82e6bed1-c7a7-4b50-af1e-68415379c41e-kube-api-access\") pod \"installer-9-crc\" (UID: \"82e6bed1-c7a7-4b50-af1e-68415379c41e\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 23:22:50 crc kubenswrapper[4734]: I1205 23:22:50.706556 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/82e6bed1-c7a7-4b50-af1e-68415379c41e-var-lock\") pod \"installer-9-crc\" (UID: \"82e6bed1-c7a7-4b50-af1e-68415379c41e\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 23:22:50 crc kubenswrapper[4734]: I1205 23:22:50.706720 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/82e6bed1-c7a7-4b50-af1e-68415379c41e-kubelet-dir\") pod \"installer-9-crc\" (UID: \"82e6bed1-c7a7-4b50-af1e-68415379c41e\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 23:22:50 crc kubenswrapper[4734]: I1205 23:22:50.706683 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/82e6bed1-c7a7-4b50-af1e-68415379c41e-var-lock\") pod \"installer-9-crc\" (UID: \"82e6bed1-c7a7-4b50-af1e-68415379c41e\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 23:22:50 crc kubenswrapper[4734]: I1205 23:22:50.706806 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/82e6bed1-c7a7-4b50-af1e-68415379c41e-kubelet-dir\") pod \"installer-9-crc\" (UID: \"82e6bed1-c7a7-4b50-af1e-68415379c41e\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 23:22:50 crc kubenswrapper[4734]: I1205 23:22:50.728187 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/82e6bed1-c7a7-4b50-af1e-68415379c41e-kube-api-access\") pod \"installer-9-crc\" (UID: \"82e6bed1-c7a7-4b50-af1e-68415379c41e\") " 
pod="openshift-kube-apiserver/installer-9-crc" Dec 05 23:22:50 crc kubenswrapper[4734]: E1205 23:22:50.730317 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 05 23:22:50 crc kubenswrapper[4734]: E1205 23:22:50.730593 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tgxsd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:n
il,} start failed in pod redhat-operators-vj45t_openshift-marketplace(7ba0c803-1b80-4161-afa1-c9b6dc65ea00): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 23:22:50 crc kubenswrapper[4734]: E1205 23:22:50.732646 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-vj45t" podUID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" Dec 05 23:22:50 crc kubenswrapper[4734]: E1205 23:22:50.736053 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 05 23:22:50 crc kubenswrapper[4734]: E1205 23:22:50.736230 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t5bcz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-kdt5b_openshift-marketplace(5040a4a1-0b01-4581-89a7-37186c3caebe): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 23:22:50 crc kubenswrapper[4734]: E1205 23:22:50.737890 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-kdt5b" podUID="5040a4a1-0b01-4581-89a7-37186c3caebe" Dec 05 23:22:50 crc 
kubenswrapper[4734]: E1205 23:22:50.742774 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 05 23:22:50 crc kubenswrapper[4734]: E1205 23:22:50.742971 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-grpqv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-hmv47_openshift-marketplace(7e24c08e-fb74-4ae6-9c48-ae9653c964e8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 23:22:50 crc kubenswrapper[4734]: E1205 23:22:50.744156 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-hmv47" podUID="7e24c08e-fb74-4ae6-9c48-ae9653c964e8" Dec 05 23:22:50 crc kubenswrapper[4734]: E1205 23:22:50.768136 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 05 23:22:50 crc kubenswrapper[4734]: E1205 23:22:50.768331 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hmd7l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-cbr82_openshift-marketplace(e6e9b180-8bc8-4f84-b1f7-ec822b6f6560): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 23:22:50 crc kubenswrapper[4734]: E1205 23:22:50.769690 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-cbr82" podUID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" Dec 05 23:22:50 crc 
kubenswrapper[4734]: I1205 23:22:50.887116 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 23:22:51 crc kubenswrapper[4734]: I1205 23:22:51.085026 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 05 23:22:51 crc kubenswrapper[4734]: I1205 23:22:51.095726 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 23:22:51 crc kubenswrapper[4734]: I1205 23:22:51.147191 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-l6r6g"] Dec 05 23:22:51 crc kubenswrapper[4734]: W1205 23:22:51.165103 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod641af4fe_dd54_4118_8985_d37a03d64f79.slice/crio-1212025a9a41d7d20b9aab66e7baa4fd52af539b9b11f840dc71974eae94113f WatchSource:0}: Error finding container 1212025a9a41d7d20b9aab66e7baa4fd52af539b9b11f840dc71974eae94113f: Status 404 returned error can't find the container with id 1212025a9a41d7d20b9aab66e7baa4fd52af539b9b11f840dc71974eae94113f Dec 05 23:22:51 crc kubenswrapper[4734]: I1205 23:22:51.664004 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ddfefb46-d158-4b10-8634-1100abebd2d6","Type":"ContainerStarted","Data":"aa8ea4271a10062bb742a4cc2cbb071e5d3e0c46160fc523c36be22490b6e974"} Dec 05 23:22:51 crc kubenswrapper[4734]: I1205 23:22:51.664332 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ddfefb46-d158-4b10-8634-1100abebd2d6","Type":"ContainerStarted","Data":"19fd057dabc57af49d947bbda93fffeaf928431a4eb19ff8105ea525bd11dec8"} Dec 05 23:22:51 crc kubenswrapper[4734]: I1205 23:22:51.665918 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"82e6bed1-c7a7-4b50-af1e-68415379c41e","Type":"ContainerStarted","Data":"6a0affb05ecd9c22a30f63a029e8adb4f86f39b5fc31b459117f01d2f92b4585"} Dec 05 23:22:51 crc kubenswrapper[4734]: I1205 23:22:51.666009 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"82e6bed1-c7a7-4b50-af1e-68415379c41e","Type":"ContainerStarted","Data":"52c1e9f2b300bb165aa1ec7c0f08c51d486e2f9e7e2de6959430710ea2c67d9d"} Dec 05 23:22:51 crc kubenswrapper[4734]: I1205 23:22:51.669227 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l6r6g" event={"ID":"641af4fe-dd54-4118-8985-d37a03d64f79","Type":"ContainerStarted","Data":"0b054b375a89f7a823b62f69e66c527e63b944624cbe9d2e8706f5cdb9b3acee"} Dec 05 23:22:51 crc kubenswrapper[4734]: I1205 23:22:51.669908 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l6r6g" event={"ID":"641af4fe-dd54-4118-8985-d37a03d64f79","Type":"ContainerStarted","Data":"1212025a9a41d7d20b9aab66e7baa4fd52af539b9b11f840dc71974eae94113f"} Dec 05 23:22:51 crc kubenswrapper[4734]: E1205 23:22:51.678943 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-kdt5b" podUID="5040a4a1-0b01-4581-89a7-37186c3caebe" Dec 05 23:22:51 crc kubenswrapper[4734]: E1205 23:22:51.678937 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-cbr82" podUID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" Dec 05 23:22:51 crc kubenswrapper[4734]: E1205 
23:22:51.679190 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-vj45t" podUID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" Dec 05 23:22:51 crc kubenswrapper[4734]: E1205 23:22:51.683989 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-hmv47" podUID="7e24c08e-fb74-4ae6-9c48-ae9653c964e8" Dec 05 23:22:51 crc kubenswrapper[4734]: I1205 23:22:51.695804 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=6.695761653 podStartE2EDuration="6.695761653s" podCreationTimestamp="2025-12-05 23:22:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:51.68343352 +0000 UTC m=+192.366837816" watchObservedRunningTime="2025-12-05 23:22:51.695761653 +0000 UTC m=+192.379165939" Dec 05 23:22:51 crc kubenswrapper[4734]: I1205 23:22:51.744111 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.7440855 podStartE2EDuration="1.7440855s" podCreationTimestamp="2025-12-05 23:22:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:51.742175773 +0000 UTC m=+192.425580049" watchObservedRunningTime="2025-12-05 23:22:51.7440855 +0000 UTC m=+192.427489776" Dec 05 23:22:52 crc kubenswrapper[4734]: I1205 23:22:52.676518 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-l6r6g" event={"ID":"641af4fe-dd54-4118-8985-d37a03d64f79","Type":"ContainerStarted","Data":"963a367f11c21ed98a2dcf4007446590af9e7470c44d7284e8ee38c4592df88f"} Dec 05 23:22:52 crc kubenswrapper[4734]: I1205 23:22:52.678970 4734 generic.go:334] "Generic (PLEG): container finished" podID="ddfefb46-d158-4b10-8634-1100abebd2d6" containerID="aa8ea4271a10062bb742a4cc2cbb071e5d3e0c46160fc523c36be22490b6e974" exitCode=0 Dec 05 23:22:52 crc kubenswrapper[4734]: I1205 23:22:52.679039 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ddfefb46-d158-4b10-8634-1100abebd2d6","Type":"ContainerDied","Data":"aa8ea4271a10062bb742a4cc2cbb071e5d3e0c46160fc523c36be22490b6e974"} Dec 05 23:22:52 crc kubenswrapper[4734]: I1205 23:22:52.701384 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-l6r6g" podStartSLOduration=172.701160526 podStartE2EDuration="2m52.701160526s" podCreationTimestamp="2025-12-05 23:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:22:52.697354032 +0000 UTC m=+193.380758328" watchObservedRunningTime="2025-12-05 23:22:52.701160526 +0000 UTC m=+193.384564802" Dec 05 23:22:53 crc kubenswrapper[4734]: I1205 23:22:53.983060 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 23:22:54 crc kubenswrapper[4734]: I1205 23:22:54.156907 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ddfefb46-d158-4b10-8634-1100abebd2d6-kube-api-access\") pod \"ddfefb46-d158-4b10-8634-1100abebd2d6\" (UID: \"ddfefb46-d158-4b10-8634-1100abebd2d6\") " Dec 05 23:22:54 crc kubenswrapper[4734]: I1205 23:22:54.157600 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ddfefb46-d158-4b10-8634-1100abebd2d6-kubelet-dir\") pod \"ddfefb46-d158-4b10-8634-1100abebd2d6\" (UID: \"ddfefb46-d158-4b10-8634-1100abebd2d6\") " Dec 05 23:22:54 crc kubenswrapper[4734]: I1205 23:22:54.158028 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ddfefb46-d158-4b10-8634-1100abebd2d6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ddfefb46-d158-4b10-8634-1100abebd2d6" (UID: "ddfefb46-d158-4b10-8634-1100abebd2d6"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:22:54 crc kubenswrapper[4734]: I1205 23:22:54.164773 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddfefb46-d158-4b10-8634-1100abebd2d6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ddfefb46-d158-4b10-8634-1100abebd2d6" (UID: "ddfefb46-d158-4b10-8634-1100abebd2d6"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:22:54 crc kubenswrapper[4734]: I1205 23:22:54.259276 4734 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ddfefb46-d158-4b10-8634-1100abebd2d6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 23:22:54 crc kubenswrapper[4734]: I1205 23:22:54.259317 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ddfefb46-d158-4b10-8634-1100abebd2d6-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 23:22:54 crc kubenswrapper[4734]: I1205 23:22:54.693158 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ddfefb46-d158-4b10-8634-1100abebd2d6","Type":"ContainerDied","Data":"19fd057dabc57af49d947bbda93fffeaf928431a4eb19ff8105ea525bd11dec8"} Dec 05 23:22:54 crc kubenswrapper[4734]: I1205 23:22:54.693204 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19fd057dabc57af49d947bbda93fffeaf928431a4eb19ff8105ea525bd11dec8" Dec 05 23:22:54 crc kubenswrapper[4734]: I1205 23:22:54.693271 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 23:22:59 crc kubenswrapper[4734]: I1205 23:22:59.724738 4734 generic.go:334] "Generic (PLEG): container finished" podID="9f0ef57c-6339-4a66-9318-7e594c611080" containerID="1c6784fe57e95af961e37977844f2ef324dfb71a04f42f938d6f8457f90960fa" exitCode=0 Dec 05 23:22:59 crc kubenswrapper[4734]: I1205 23:22:59.724845 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k6qjj" event={"ID":"9f0ef57c-6339-4a66-9318-7e594c611080","Type":"ContainerDied","Data":"1c6784fe57e95af961e37977844f2ef324dfb71a04f42f938d6f8457f90960fa"} Dec 05 23:23:02 crc kubenswrapper[4734]: I1205 23:23:02.749629 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k6qjj" event={"ID":"9f0ef57c-6339-4a66-9318-7e594c611080","Type":"ContainerStarted","Data":"d9231a436993d14e1ad719e0fdbd791e363c27cc8dd18489cee6061e0f22256c"} Dec 05 23:23:02 crc kubenswrapper[4734]: I1205 23:23:02.769408 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k6qjj" podStartSLOduration=3.274569479 podStartE2EDuration="55.769387314s" podCreationTimestamp="2025-12-05 23:22:07 +0000 UTC" firstStartedPulling="2025-12-05 23:22:09.225794787 +0000 UTC m=+149.909199063" lastFinishedPulling="2025-12-05 23:23:01.720612622 +0000 UTC m=+202.404016898" observedRunningTime="2025-12-05 23:23:02.767285912 +0000 UTC m=+203.450690188" watchObservedRunningTime="2025-12-05 23:23:02.769387314 +0000 UTC m=+203.452791590" Dec 05 23:23:03 crc kubenswrapper[4734]: I1205 23:23:03.756851 4734 generic.go:334] "Generic (PLEG): container finished" podID="597348be-fe32-4495-bb10-d152ed593e3e" containerID="efb50aabad288de103b27689366d3b4e6b599e57b689147cbf99a5c136ba5944" exitCode=0 Dec 05 23:23:03 crc kubenswrapper[4734]: I1205 23:23:03.756934 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-cb4rj" event={"ID":"597348be-fe32-4495-bb10-d152ed593e3e","Type":"ContainerDied","Data":"efb50aabad288de103b27689366d3b4e6b599e57b689147cbf99a5c136ba5944"} Dec 05 23:23:04 crc kubenswrapper[4734]: I1205 23:23:04.781767 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cbr82" event={"ID":"e6e9b180-8bc8-4f84-b1f7-ec822b6f6560","Type":"ContainerStarted","Data":"7f0890e578143a4ac9443173ff9c6d3c9de42bd3bc235b95df529824f2955319"} Dec 05 23:23:05 crc kubenswrapper[4734]: I1205 23:23:05.793983 4734 generic.go:334] "Generic (PLEG): container finished" podID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" containerID="7f0890e578143a4ac9443173ff9c6d3c9de42bd3bc235b95df529824f2955319" exitCode=0 Dec 05 23:23:05 crc kubenswrapper[4734]: I1205 23:23:05.794450 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cbr82" event={"ID":"e6e9b180-8bc8-4f84-b1f7-ec822b6f6560","Type":"ContainerDied","Data":"7f0890e578143a4ac9443173ff9c6d3c9de42bd3bc235b95df529824f2955319"} Dec 05 23:23:08 crc kubenswrapper[4734]: I1205 23:23:08.040508 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k6qjj" Dec 05 23:23:08 crc kubenswrapper[4734]: I1205 23:23:08.040601 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k6qjj" Dec 05 23:23:08 crc kubenswrapper[4734]: I1205 23:23:08.792292 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k6qjj" Dec 05 23:23:08 crc kubenswrapper[4734]: I1205 23:23:08.860817 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k6qjj" Dec 05 23:23:11 crc kubenswrapper[4734]: I1205 23:23:11.852841 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-k6qjj"] Dec 05 23:23:11 crc kubenswrapper[4734]: I1205 23:23:11.853472 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k6qjj" podUID="9f0ef57c-6339-4a66-9318-7e594c611080" containerName="registry-server" containerID="cri-o://d9231a436993d14e1ad719e0fdbd791e363c27cc8dd18489cee6061e0f22256c" gracePeriod=2 Dec 05 23:23:14 crc kubenswrapper[4734]: I1205 23:23:14.859073 4734 generic.go:334] "Generic (PLEG): container finished" podID="9f0ef57c-6339-4a66-9318-7e594c611080" containerID="d9231a436993d14e1ad719e0fdbd791e363c27cc8dd18489cee6061e0f22256c" exitCode=0 Dec 05 23:23:14 crc kubenswrapper[4734]: I1205 23:23:14.859113 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k6qjj" event={"ID":"9f0ef57c-6339-4a66-9318-7e594c611080","Type":"ContainerDied","Data":"d9231a436993d14e1ad719e0fdbd791e363c27cc8dd18489cee6061e0f22256c"} Dec 05 23:23:15 crc kubenswrapper[4734]: I1205 23:23:15.790852 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k6qjj" Dec 05 23:23:15 crc kubenswrapper[4734]: I1205 23:23:15.877406 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k6qjj" event={"ID":"9f0ef57c-6339-4a66-9318-7e594c611080","Type":"ContainerDied","Data":"2ad408f4b63806e3458fc92a15f61be5aa1cad05e76069171e0bcd192735ca15"} Dec 05 23:23:15 crc kubenswrapper[4734]: I1205 23:23:15.877478 4734 scope.go:117] "RemoveContainer" containerID="d9231a436993d14e1ad719e0fdbd791e363c27cc8dd18489cee6061e0f22256c" Dec 05 23:23:15 crc kubenswrapper[4734]: I1205 23:23:15.877574 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k6qjj" Dec 05 23:23:15 crc kubenswrapper[4734]: I1205 23:23:15.895224 4734 scope.go:117] "RemoveContainer" containerID="1c6784fe57e95af961e37977844f2ef324dfb71a04f42f938d6f8457f90960fa" Dec 05 23:23:15 crc kubenswrapper[4734]: I1205 23:23:15.912009 4734 scope.go:117] "RemoveContainer" containerID="87a98d7ad0d34bc09562daa39f25e6f8874358b78261fea21c41bfa98c2ea386" Dec 05 23:23:15 crc kubenswrapper[4734]: I1205 23:23:15.987793 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f0ef57c-6339-4a66-9318-7e594c611080-utilities\") pod \"9f0ef57c-6339-4a66-9318-7e594c611080\" (UID: \"9f0ef57c-6339-4a66-9318-7e594c611080\") " Dec 05 23:23:15 crc kubenswrapper[4734]: I1205 23:23:15.988179 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ct6q\" (UniqueName: \"kubernetes.io/projected/9f0ef57c-6339-4a66-9318-7e594c611080-kube-api-access-9ct6q\") pod \"9f0ef57c-6339-4a66-9318-7e594c611080\" (UID: \"9f0ef57c-6339-4a66-9318-7e594c611080\") " Dec 05 23:23:15 crc kubenswrapper[4734]: I1205 23:23:15.988269 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f0ef57c-6339-4a66-9318-7e594c611080-catalog-content\") pod \"9f0ef57c-6339-4a66-9318-7e594c611080\" (UID: \"9f0ef57c-6339-4a66-9318-7e594c611080\") " Dec 05 23:23:15 crc kubenswrapper[4734]: I1205 23:23:15.988941 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f0ef57c-6339-4a66-9318-7e594c611080-utilities" (OuterVolumeSpecName: "utilities") pod "9f0ef57c-6339-4a66-9318-7e594c611080" (UID: "9f0ef57c-6339-4a66-9318-7e594c611080"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:23:15 crc kubenswrapper[4734]: I1205 23:23:15.995553 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f0ef57c-6339-4a66-9318-7e594c611080-kube-api-access-9ct6q" (OuterVolumeSpecName: "kube-api-access-9ct6q") pod "9f0ef57c-6339-4a66-9318-7e594c611080" (UID: "9f0ef57c-6339-4a66-9318-7e594c611080"). InnerVolumeSpecName "kube-api-access-9ct6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:23:16 crc kubenswrapper[4734]: I1205 23:23:16.008911 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f0ef57c-6339-4a66-9318-7e594c611080-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f0ef57c-6339-4a66-9318-7e594c611080" (UID: "9f0ef57c-6339-4a66-9318-7e594c611080"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:23:16 crc kubenswrapper[4734]: I1205 23:23:16.090847 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f0ef57c-6339-4a66-9318-7e594c611080-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:23:16 crc kubenswrapper[4734]: I1205 23:23:16.091461 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ct6q\" (UniqueName: \"kubernetes.io/projected/9f0ef57c-6339-4a66-9318-7e594c611080-kube-api-access-9ct6q\") on node \"crc\" DevicePath \"\"" Dec 05 23:23:16 crc kubenswrapper[4734]: I1205 23:23:16.091788 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f0ef57c-6339-4a66-9318-7e594c611080-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:23:16 crc kubenswrapper[4734]: I1205 23:23:16.222649 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k6qjj"] Dec 05 23:23:16 crc kubenswrapper[4734]: I1205 
23:23:16.225870 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k6qjj"] Dec 05 23:23:17 crc kubenswrapper[4734]: I1205 23:23:17.630748 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f0ef57c-6339-4a66-9318-7e594c611080" path="/var/lib/kubelet/pods/9f0ef57c-6339-4a66-9318-7e594c611080/volumes" Dec 05 23:23:17 crc kubenswrapper[4734]: I1205 23:23:17.897200 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kdt5b" event={"ID":"5040a4a1-0b01-4581-89a7-37186c3caebe","Type":"ContainerStarted","Data":"d2bfac5855f26fc9b1373233af371cc973951d17c8e5c26ecf0de2fbcc8c8ac1"} Dec 05 23:23:17 crc kubenswrapper[4734]: I1205 23:23:17.900074 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cb4rj" event={"ID":"597348be-fe32-4495-bb10-d152ed593e3e","Type":"ContainerStarted","Data":"8f3a5e56a02fbdb8a0a5a8736fd80d28d5c42b5deb684663ff6910b3b0b752bd"} Dec 05 23:23:17 crc kubenswrapper[4734]: I1205 23:23:17.902091 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hmv47" event={"ID":"7e24c08e-fb74-4ae6-9c48-ae9653c964e8","Type":"ContainerStarted","Data":"a7e3f3325c47bebf23eaa664ce1f52666c0ad29104a891802550e978df6f9dc9"} Dec 05 23:23:17 crc kubenswrapper[4734]: I1205 23:23:17.903962 4734 generic.go:334] "Generic (PLEG): container finished" podID="15bf5615-0adc-46cd-8796-d419076acac7" containerID="9a9d481c6611f4c79cc0f91b17882461c5b0e02a7620eaef3f8402702c6c1dcb" exitCode=0 Dec 05 23:23:17 crc kubenswrapper[4734]: I1205 23:23:17.904011 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-67l49" event={"ID":"15bf5615-0adc-46cd-8796-d419076acac7","Type":"ContainerDied","Data":"9a9d481c6611f4c79cc0f91b17882461c5b0e02a7620eaef3f8402702c6c1dcb"} Dec 05 23:23:18 crc kubenswrapper[4734]: I1205 23:23:18.341225 4734 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gxdpj"] Dec 05 23:23:18 crc kubenswrapper[4734]: I1205 23:23:18.913818 4734 generic.go:334] "Generic (PLEG): container finished" podID="7e24c08e-fb74-4ae6-9c48-ae9653c964e8" containerID="a7e3f3325c47bebf23eaa664ce1f52666c0ad29104a891802550e978df6f9dc9" exitCode=0 Dec 05 23:23:18 crc kubenswrapper[4734]: I1205 23:23:18.913901 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hmv47" event={"ID":"7e24c08e-fb74-4ae6-9c48-ae9653c964e8","Type":"ContainerDied","Data":"a7e3f3325c47bebf23eaa664ce1f52666c0ad29104a891802550e978df6f9dc9"} Dec 05 23:23:18 crc kubenswrapper[4734]: I1205 23:23:18.916866 4734 generic.go:334] "Generic (PLEG): container finished" podID="5040a4a1-0b01-4581-89a7-37186c3caebe" containerID="d2bfac5855f26fc9b1373233af371cc973951d17c8e5c26ecf0de2fbcc8c8ac1" exitCode=0 Dec 05 23:23:18 crc kubenswrapper[4734]: I1205 23:23:18.916965 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kdt5b" event={"ID":"5040a4a1-0b01-4581-89a7-37186c3caebe","Type":"ContainerDied","Data":"d2bfac5855f26fc9b1373233af371cc973951d17c8e5c26ecf0de2fbcc8c8ac1"} Dec 05 23:23:18 crc kubenswrapper[4734]: I1205 23:23:18.963823 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cb4rj" podStartSLOduration=7.496207934 podStartE2EDuration="1m13.96379998s" podCreationTimestamp="2025-12-05 23:22:05 +0000 UTC" firstStartedPulling="2025-12-05 23:22:07.943266554 +0000 UTC m=+148.626670830" lastFinishedPulling="2025-12-05 23:23:14.4108586 +0000 UTC m=+215.094262876" observedRunningTime="2025-12-05 23:23:18.958397322 +0000 UTC m=+219.641801608" watchObservedRunningTime="2025-12-05 23:23:18.96379998 +0000 UTC m=+219.647204256" Dec 05 23:23:19 crc kubenswrapper[4734]: I1205 23:23:19.924890 4734 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-vj45t" event={"ID":"7ba0c803-1b80-4161-afa1-c9b6dc65ea00","Type":"ContainerStarted","Data":"95a8a98bccba41294acbadfb078a33cdb8cfaf47c941ac39c5a646b1b981473d"} Dec 05 23:23:19 crc kubenswrapper[4734]: I1205 23:23:19.927544 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gnbkp" event={"ID":"4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4","Type":"ContainerStarted","Data":"5a80b971db92270e639c74e75d715e51fd9aa35033f323865636c2d7e7f770ab"} Dec 05 23:23:19 crc kubenswrapper[4734]: I1205 23:23:19.930458 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cbr82" event={"ID":"e6e9b180-8bc8-4f84-b1f7-ec822b6f6560","Type":"ContainerStarted","Data":"bea406031d4e50c3113dd26dc4f848e9cc4e8b57d3bc63e09066d4009a36be34"} Dec 05 23:23:19 crc kubenswrapper[4734]: I1205 23:23:19.989499 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cbr82" podStartSLOduration=3.561843589 podStartE2EDuration="1m14.989478088s" podCreationTimestamp="2025-12-05 23:22:05 +0000 UTC" firstStartedPulling="2025-12-05 23:22:07.9178573 +0000 UTC m=+148.601261576" lastFinishedPulling="2025-12-05 23:23:19.345491799 +0000 UTC m=+220.028896075" observedRunningTime="2025-12-05 23:23:19.986111599 +0000 UTC m=+220.669515875" watchObservedRunningTime="2025-12-05 23:23:19.989478088 +0000 UTC m=+220.672882364" Dec 05 23:23:20 crc kubenswrapper[4734]: I1205 23:23:20.445123 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:23:20 crc kubenswrapper[4734]: I1205 23:23:20.446006 4734 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:23:20 crc kubenswrapper[4734]: I1205 23:23:20.446167 4734 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" Dec 05 23:23:20 crc kubenswrapper[4734]: I1205 23:23:20.446999 4734 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18"} pod="openshift-machine-config-operator/machine-config-daemon-vn94d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 23:23:20 crc kubenswrapper[4734]: I1205 23:23:20.447216 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" containerID="cri-o://2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18" gracePeriod=600 Dec 05 23:23:20 crc kubenswrapper[4734]: I1205 23:23:20.939185 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kdt5b" event={"ID":"5040a4a1-0b01-4581-89a7-37186c3caebe","Type":"ContainerStarted","Data":"8c612793464e454a21b04c406c2a3457cc0a807a995eba7d6716550f916c9b8b"} Dec 05 23:23:20 crc kubenswrapper[4734]: I1205 23:23:20.941912 4734 generic.go:334] "Generic (PLEG): container finished" podID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" containerID="95a8a98bccba41294acbadfb078a33cdb8cfaf47c941ac39c5a646b1b981473d" exitCode=0 Dec 05 23:23:20 crc kubenswrapper[4734]: I1205 23:23:20.941990 4734 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-vj45t" event={"ID":"7ba0c803-1b80-4161-afa1-c9b6dc65ea00","Type":"ContainerDied","Data":"95a8a98bccba41294acbadfb078a33cdb8cfaf47c941ac39c5a646b1b981473d"} Dec 05 23:23:20 crc kubenswrapper[4734]: I1205 23:23:20.945577 4734 generic.go:334] "Generic (PLEG): container finished" podID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" containerID="5a80b971db92270e639c74e75d715e51fd9aa35033f323865636c2d7e7f770ab" exitCode=0 Dec 05 23:23:20 crc kubenswrapper[4734]: I1205 23:23:20.945681 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gnbkp" event={"ID":"4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4","Type":"ContainerDied","Data":"5a80b971db92270e639c74e75d715e51fd9aa35033f323865636c2d7e7f770ab"} Dec 05 23:23:20 crc kubenswrapper[4734]: I1205 23:23:20.953242 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hmv47" event={"ID":"7e24c08e-fb74-4ae6-9c48-ae9653c964e8","Type":"ContainerStarted","Data":"c893a3f02e495017bbd5ae00c480a8729469b43d7ab234724e209dc00e833de8"} Dec 05 23:23:20 crc kubenswrapper[4734]: I1205 23:23:20.957407 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-67l49" event={"ID":"15bf5615-0adc-46cd-8796-d419076acac7","Type":"ContainerStarted","Data":"83a62aec78d9ca65c2211b66e19eb5232b6b0a0b1aa4fe44bda308d235afaae3"} Dec 05 23:23:20 crc kubenswrapper[4734]: I1205 23:23:20.965854 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kdt5b" podStartSLOduration=2.617204429 podStartE2EDuration="1m13.965834155s" podCreationTimestamp="2025-12-05 23:22:07 +0000 UTC" firstStartedPulling="2025-12-05 23:22:09.139756783 +0000 UTC m=+149.823161059" lastFinishedPulling="2025-12-05 23:23:20.488386509 +0000 UTC m=+221.171790785" observedRunningTime="2025-12-05 23:23:20.965132684 +0000 UTC m=+221.648536960" 
watchObservedRunningTime="2025-12-05 23:23:20.965834155 +0000 UTC m=+221.649238431" Dec 05 23:23:20 crc kubenswrapper[4734]: I1205 23:23:20.966863 4734 generic.go:334] "Generic (PLEG): container finished" podID="65758270-a7a7-46b5-af95-0588daf9fa86" containerID="2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18" exitCode=0 Dec 05 23:23:20 crc kubenswrapper[4734]: I1205 23:23:20.966912 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerDied","Data":"2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18"} Dec 05 23:23:20 crc kubenswrapper[4734]: I1205 23:23:20.966941 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerStarted","Data":"d4346c20725cce5df929f1d9a537d5302866dcd17b21ee10d0662364730d69a9"} Dec 05 23:23:21 crc kubenswrapper[4734]: I1205 23:23:21.013736 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-67l49" podStartSLOduration=3.761494207 podStartE2EDuration="1m16.013715653s" podCreationTimestamp="2025-12-05 23:22:05 +0000 UTC" firstStartedPulling="2025-12-05 23:22:08.041902789 +0000 UTC m=+148.725307065" lastFinishedPulling="2025-12-05 23:23:20.294124235 +0000 UTC m=+220.977528511" observedRunningTime="2025-12-05 23:23:21.01054785 +0000 UTC m=+221.693952136" watchObservedRunningTime="2025-12-05 23:23:21.013715653 +0000 UTC m=+221.697119929" Dec 05 23:23:21 crc kubenswrapper[4734]: I1205 23:23:21.042601 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hmv47" podStartSLOduration=3.626550834 podStartE2EDuration="1m16.042570486s" podCreationTimestamp="2025-12-05 23:22:05 +0000 UTC" firstStartedPulling="2025-12-05 
23:22:07.951486457 +0000 UTC m=+148.634890733" lastFinishedPulling="2025-12-05 23:23:20.367506109 +0000 UTC m=+221.050910385" observedRunningTime="2025-12-05 23:23:21.037209549 +0000 UTC m=+221.720613835" watchObservedRunningTime="2025-12-05 23:23:21.042570486 +0000 UTC m=+221.725974752" Dec 05 23:23:21 crc kubenswrapper[4734]: I1205 23:23:21.982830 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gnbkp" event={"ID":"4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4","Type":"ContainerStarted","Data":"0ad97f0b80b9db497c44d78afb09b9217c04438574d4c9f552853398eddc04ec"} Dec 05 23:23:22 crc kubenswrapper[4734]: I1205 23:23:22.002306 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gnbkp" podStartSLOduration=2.699510419 podStartE2EDuration="1m13.002284206s" podCreationTimestamp="2025-12-05 23:22:09 +0000 UTC" firstStartedPulling="2025-12-05 23:22:11.291250466 +0000 UTC m=+151.974654742" lastFinishedPulling="2025-12-05 23:23:21.594024253 +0000 UTC m=+222.277428529" observedRunningTime="2025-12-05 23:23:22.000844124 +0000 UTC m=+222.684248400" watchObservedRunningTime="2025-12-05 23:23:22.002284206 +0000 UTC m=+222.685688472" Dec 05 23:23:22 crc kubenswrapper[4734]: I1205 23:23:22.991757 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vj45t" event={"ID":"7ba0c803-1b80-4161-afa1-c9b6dc65ea00","Type":"ContainerStarted","Data":"5b110ebd20ad70373883a8604089e967c00f134bcab314c5426fd46a1753c413"} Dec 05 23:23:23 crc kubenswrapper[4734]: I1205 23:23:23.016682 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vj45t" podStartSLOduration=3.328686661 podStartE2EDuration="1m15.016652624s" podCreationTimestamp="2025-12-05 23:22:08 +0000 UTC" firstStartedPulling="2025-12-05 23:22:10.234400539 +0000 UTC m=+150.917804815" lastFinishedPulling="2025-12-05 
23:23:21.922366502 +0000 UTC m=+222.605770778" observedRunningTime="2025-12-05 23:23:23.012182792 +0000 UTC m=+223.695587088" watchObservedRunningTime="2025-12-05 23:23:23.016652624 +0000 UTC m=+223.700056900" Dec 05 23:23:25 crc kubenswrapper[4734]: I1205 23:23:25.668348 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hmv47" Dec 05 23:23:25 crc kubenswrapper[4734]: I1205 23:23:25.668424 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hmv47" Dec 05 23:23:25 crc kubenswrapper[4734]: I1205 23:23:25.850367 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cb4rj" Dec 05 23:23:25 crc kubenswrapper[4734]: I1205 23:23:25.850850 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cb4rj" Dec 05 23:23:25 crc kubenswrapper[4734]: I1205 23:23:25.892636 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cb4rj" Dec 05 23:23:26 crc kubenswrapper[4734]: I1205 23:23:26.049436 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cb4rj" Dec 05 23:23:26 crc kubenswrapper[4734]: I1205 23:23:26.083934 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cbr82" Dec 05 23:23:26 crc kubenswrapper[4734]: I1205 23:23:26.084002 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cbr82" Dec 05 23:23:26 crc kubenswrapper[4734]: I1205 23:23:26.135737 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cbr82" Dec 05 23:23:26 crc kubenswrapper[4734]: I1205 23:23:26.161380 4734 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hmv47" Dec 05 23:23:26 crc kubenswrapper[4734]: I1205 23:23:26.221817 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hmv47" Dec 05 23:23:26 crc kubenswrapper[4734]: I1205 23:23:26.248965 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-67l49" Dec 05 23:23:26 crc kubenswrapper[4734]: I1205 23:23:26.249036 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-67l49" Dec 05 23:23:26 crc kubenswrapper[4734]: I1205 23:23:26.291983 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-67l49" Dec 05 23:23:27 crc kubenswrapper[4734]: I1205 23:23:27.060655 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-67l49" Dec 05 23:23:27 crc kubenswrapper[4734]: I1205 23:23:27.065775 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cbr82" Dec 05 23:23:27 crc kubenswrapper[4734]: I1205 23:23:27.533232 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kdt5b" Dec 05 23:23:27 crc kubenswrapper[4734]: I1205 23:23:27.533331 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kdt5b" Dec 05 23:23:27 crc kubenswrapper[4734]: I1205 23:23:27.582820 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kdt5b" Dec 05 23:23:28 crc kubenswrapper[4734]: I1205 23:23:28.062885 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-kdt5b" Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.041736 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vj45t" Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.041863 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vj45t" Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.113612 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vj45t" Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.154345 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-67l49"] Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.154674 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-67l49" podUID="15bf5615-0adc-46cd-8796-d419076acac7" containerName="registry-server" containerID="cri-o://83a62aec78d9ca65c2211b66e19eb5232b6b0a0b1aa4fe44bda308d235afaae3" gracePeriod=2 Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.216489 4734 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 05 23:23:29 crc kubenswrapper[4734]: E1205 23:23:29.216779 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddfefb46-d158-4b10-8634-1100abebd2d6" containerName="pruner" Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.216809 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddfefb46-d158-4b10-8634-1100abebd2d6" containerName="pruner" Dec 05 23:23:29 crc kubenswrapper[4734]: E1205 23:23:29.216830 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f0ef57c-6339-4a66-9318-7e594c611080" containerName="extract-utilities" Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.216841 4734 
state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0ef57c-6339-4a66-9318-7e594c611080" containerName="extract-utilities" Dec 05 23:23:29 crc kubenswrapper[4734]: E1205 23:23:29.216861 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f0ef57c-6339-4a66-9318-7e594c611080" containerName="registry-server" Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.216868 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0ef57c-6339-4a66-9318-7e594c611080" containerName="registry-server" Dec 05 23:23:29 crc kubenswrapper[4734]: E1205 23:23:29.216879 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f0ef57c-6339-4a66-9318-7e594c611080" containerName="extract-content" Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.216886 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0ef57c-6339-4a66-9318-7e594c611080" containerName="extract-content" Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.216997 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f0ef57c-6339-4a66-9318-7e594c611080" containerName="registry-server" Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.217015 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddfefb46-d158-4b10-8634-1100abebd2d6" containerName="pruner" Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.217379 4734 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.217557 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.217787 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0" gracePeriod=15 Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.217823 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818" gracePeriod=15 Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.217859 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6" gracePeriod=15 Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.217828 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097" gracePeriod=15 Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.217830 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30" gracePeriod=15 Dec 05 23:23:29 crc 
kubenswrapper[4734]: I1205 23:23:29.219460 4734 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 23:23:29 crc kubenswrapper[4734]: E1205 23:23:29.219809 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.219827 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 05 23:23:29 crc kubenswrapper[4734]: E1205 23:23:29.219842 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.219856 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 05 23:23:29 crc kubenswrapper[4734]: E1205 23:23:29.219866 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.219891 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 23:23:29 crc kubenswrapper[4734]: E1205 23:23:29.219904 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.219910 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 05 23:23:29 crc kubenswrapper[4734]: E1205 23:23:29.219920 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 05 23:23:29 crc 
kubenswrapper[4734]: I1205 23:23:29.219926 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 05 23:23:29 crc kubenswrapper[4734]: E1205 23:23:29.219946 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.219973 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.220098 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.220139 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.220147 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.220154 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.220165 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.279308 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.312875 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.313637 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.313804 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.313947 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.314078 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.314253 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.314424 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.314684 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.352709 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cbr82"] Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.353106 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cbr82" podUID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" containerName="registry-server" containerID="cri-o://bea406031d4e50c3113dd26dc4f848e9cc4e8b57d3bc63e09066d4009a36be34" gracePeriod=2 Dec 05 23:23:29 crc kubenswrapper[4734]: I1205 23:23:29.383225 4734 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 
23:23:30.024882 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.024952 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.024990 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.025014 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.025047 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.025068 4734 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.025120 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.025174 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.025281 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.025339 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.025378 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.025414 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.025448 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.025481 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.025512 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.025566 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.036880 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gnbkp" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.036926 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gnbkp" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.095024 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gnbkp" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.096036 4734 status_manager.go:851] "Failed to get status for pod" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" pod="openshift-marketplace/redhat-operators-gnbkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gnbkp\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.139824 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vj45t" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.140497 4734 status_manager.go:851] "Failed to get status for pod" podUID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" pod="openshift-marketplace/redhat-operators-vj45t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vj45t\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.141303 4734 status_manager.go:851] "Failed to get status for pod" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" pod="openshift-marketplace/redhat-operators-gnbkp" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gnbkp\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.150366 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gnbkp" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.151195 4734 status_manager.go:851] "Failed to get status for pod" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" pod="openshift-marketplace/redhat-operators-gnbkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gnbkp\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.151401 4734 status_manager.go:851] "Failed to get status for pod" podUID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" pod="openshift-marketplace/redhat-operators-vj45t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vj45t\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.167044 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 23:23:30 crc kubenswrapper[4734]: W1205 23:23:30.254357 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-cecefdd758d496d28b8b21fcba727bf0fab78e1e64fc6581d5e092f7dc727f5e WatchSource:0}: Error finding container cecefdd758d496d28b8b21fcba727bf0fab78e1e64fc6581d5e092f7dc727f5e: Status 404 returned error can't find the container with id cecefdd758d496d28b8b21fcba727bf0fab78e1e64fc6581d5e092f7dc727f5e Dec 05 23:23:30 crc kubenswrapper[4734]: E1205 23:23:30.259907 4734 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.38:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e753d55319799 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 23:23:30.258581401 +0000 UTC m=+230.941985677,LastTimestamp:2025-12-05 23:23:30.258581401 +0000 UTC m=+230.941985677,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.440784 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cbr82" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.442631 4734 status_manager.go:851] "Failed to get status for pod" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" pod="openshift-marketplace/redhat-operators-gnbkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gnbkp\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.443314 4734 status_manager.go:851] "Failed to get status for pod" podUID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" pod="openshift-marketplace/certified-operators-cbr82" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cbr82\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.443695 4734 status_manager.go:851] "Failed to get status for pod" podUID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" pod="openshift-marketplace/redhat-operators-vj45t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vj45t\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.536969 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6e9b180-8bc8-4f84-b1f7-ec822b6f6560-catalog-content\") pod \"e6e9b180-8bc8-4f84-b1f7-ec822b6f6560\" (UID: \"e6e9b180-8bc8-4f84-b1f7-ec822b6f6560\") " Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.537034 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmd7l\" (UniqueName: \"kubernetes.io/projected/e6e9b180-8bc8-4f84-b1f7-ec822b6f6560-kube-api-access-hmd7l\") pod \"e6e9b180-8bc8-4f84-b1f7-ec822b6f6560\" (UID: \"e6e9b180-8bc8-4f84-b1f7-ec822b6f6560\") " Dec 05 23:23:30 
crc kubenswrapper[4734]: I1205 23:23:30.537106 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6e9b180-8bc8-4f84-b1f7-ec822b6f6560-utilities\") pod \"e6e9b180-8bc8-4f84-b1f7-ec822b6f6560\" (UID: \"e6e9b180-8bc8-4f84-b1f7-ec822b6f6560\") " Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.538106 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6e9b180-8bc8-4f84-b1f7-ec822b6f6560-utilities" (OuterVolumeSpecName: "utilities") pod "e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" (UID: "e6e9b180-8bc8-4f84-b1f7-ec822b6f6560"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.543176 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6e9b180-8bc8-4f84-b1f7-ec822b6f6560-kube-api-access-hmd7l" (OuterVolumeSpecName: "kube-api-access-hmd7l") pod "e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" (UID: "e6e9b180-8bc8-4f84-b1f7-ec822b6f6560"). InnerVolumeSpecName "kube-api-access-hmd7l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:23:30 crc kubenswrapper[4734]: E1205 23:23:30.598751 4734 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.38:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e753d55319799 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 23:23:30.258581401 +0000 UTC m=+230.941985677,LastTimestamp:2025-12-05 23:23:30.258581401 +0000 UTC m=+230.941985677,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.612573 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6e9b180-8bc8-4f84-b1f7-ec822b6f6560-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" (UID: "e6e9b180-8bc8-4f84-b1f7-ec822b6f6560"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.639119 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6e9b180-8bc8-4f84-b1f7-ec822b6f6560-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.639211 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6e9b180-8bc8-4f84-b1f7-ec822b6f6560-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.639261 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmd7l\" (UniqueName: \"kubernetes.io/projected/e6e9b180-8bc8-4f84-b1f7-ec822b6f6560-kube-api-access-hmd7l\") on node \"crc\" DevicePath \"\"" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.661623 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-67l49" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.662433 4734 status_manager.go:851] "Failed to get status for pod" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" pod="openshift-marketplace/redhat-operators-gnbkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gnbkp\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.662969 4734 status_manager.go:851] "Failed to get status for pod" podUID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" pod="openshift-marketplace/certified-operators-cbr82" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cbr82\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.663203 4734 status_manager.go:851] "Failed to get status for pod" 
podUID="15bf5615-0adc-46cd-8796-d419076acac7" pod="openshift-marketplace/community-operators-67l49" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-67l49\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.663423 4734 status_manager.go:851] "Failed to get status for pod" podUID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" pod="openshift-marketplace/redhat-operators-vj45t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vj45t\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.842725 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15bf5615-0adc-46cd-8796-d419076acac7-utilities\") pod \"15bf5615-0adc-46cd-8796-d419076acac7\" (UID: \"15bf5615-0adc-46cd-8796-d419076acac7\") " Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.842815 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15bf5615-0adc-46cd-8796-d419076acac7-catalog-content\") pod \"15bf5615-0adc-46cd-8796-d419076acac7\" (UID: \"15bf5615-0adc-46cd-8796-d419076acac7\") " Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.842872 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7hrp\" (UniqueName: \"kubernetes.io/projected/15bf5615-0adc-46cd-8796-d419076acac7-kube-api-access-f7hrp\") pod \"15bf5615-0adc-46cd-8796-d419076acac7\" (UID: \"15bf5615-0adc-46cd-8796-d419076acac7\") " Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.843776 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15bf5615-0adc-46cd-8796-d419076acac7-utilities" (OuterVolumeSpecName: "utilities") pod 
"15bf5615-0adc-46cd-8796-d419076acac7" (UID: "15bf5615-0adc-46cd-8796-d419076acac7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.849238 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15bf5615-0adc-46cd-8796-d419076acac7-kube-api-access-f7hrp" (OuterVolumeSpecName: "kube-api-access-f7hrp") pod "15bf5615-0adc-46cd-8796-d419076acac7" (UID: "15bf5615-0adc-46cd-8796-d419076acac7"). InnerVolumeSpecName "kube-api-access-f7hrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.902582 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15bf5615-0adc-46cd-8796-d419076acac7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15bf5615-0adc-46cd-8796-d419076acac7" (UID: "15bf5615-0adc-46cd-8796-d419076acac7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.944357 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15bf5615-0adc-46cd-8796-d419076acac7-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.944421 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15bf5615-0adc-46cd-8796-d419076acac7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:23:30 crc kubenswrapper[4734]: I1205 23:23:30.944442 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7hrp\" (UniqueName: \"kubernetes.io/projected/15bf5615-0adc-46cd-8796-d419076acac7-kube-api-access-f7hrp\") on node \"crc\" DevicePath \"\"" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.103035 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"37f5547a1e7a533696a3421088bfced9cb82e86649446f6c81beda8303138121"} Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.103340 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"cecefdd758d496d28b8b21fcba727bf0fab78e1e64fc6581d5e092f7dc727f5e"} Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.103946 4734 status_manager.go:851] "Failed to get status for pod" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" pod="openshift-marketplace/redhat-operators-gnbkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gnbkp\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.104686 4734 
status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.105274 4734 status_manager.go:851] "Failed to get status for pod" podUID="15bf5615-0adc-46cd-8796-d419076acac7" pod="openshift-marketplace/community-operators-67l49" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-67l49\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.105662 4734 status_manager.go:851] "Failed to get status for pod" podUID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" pod="openshift-marketplace/certified-operators-cbr82" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cbr82\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.106059 4734 status_manager.go:851] "Failed to get status for pod" podUID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" pod="openshift-marketplace/redhat-operators-vj45t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vj45t\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.106424 4734 generic.go:334] "Generic (PLEG): container finished" podID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" containerID="bea406031d4e50c3113dd26dc4f848e9cc4e8b57d3bc63e09066d4009a36be34" exitCode=0 Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.106550 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cbr82" 
event={"ID":"e6e9b180-8bc8-4f84-b1f7-ec822b6f6560","Type":"ContainerDied","Data":"bea406031d4e50c3113dd26dc4f848e9cc4e8b57d3bc63e09066d4009a36be34"} Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.106584 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cbr82" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.106673 4734 scope.go:117] "RemoveContainer" containerID="bea406031d4e50c3113dd26dc4f848e9cc4e8b57d3bc63e09066d4009a36be34" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.106703 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cbr82" event={"ID":"e6e9b180-8bc8-4f84-b1f7-ec822b6f6560","Type":"ContainerDied","Data":"ebf3e81e7fcb0ee47bfecad9782be3dbcb5e1b8f5fb09825ab1423037466053d"} Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.107429 4734 status_manager.go:851] "Failed to get status for pod" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" pod="openshift-marketplace/redhat-operators-gnbkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gnbkp\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.107923 4734 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.108873 4734 status_manager.go:851] "Failed to get status for pod" podUID="15bf5615-0adc-46cd-8796-d419076acac7" pod="openshift-marketplace/community-operators-67l49" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-67l49\": 
dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.109271 4734 status_manager.go:851] "Failed to get status for pod" podUID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" pod="openshift-marketplace/certified-operators-cbr82" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cbr82\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.109782 4734 status_manager.go:851] "Failed to get status for pod" podUID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" pod="openshift-marketplace/redhat-operators-vj45t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vj45t\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.110813 4734 generic.go:334] "Generic (PLEG): container finished" podID="15bf5615-0adc-46cd-8796-d419076acac7" containerID="83a62aec78d9ca65c2211b66e19eb5232b6b0a0b1aa4fe44bda308d235afaae3" exitCode=0 Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.110905 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-67l49" event={"ID":"15bf5615-0adc-46cd-8796-d419076acac7","Type":"ContainerDied","Data":"83a62aec78d9ca65c2211b66e19eb5232b6b0a0b1aa4fe44bda308d235afaae3"} Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.110946 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-67l49" event={"ID":"15bf5615-0adc-46cd-8796-d419076acac7","Type":"ContainerDied","Data":"b5d0cb9a1d47412c5a027563ad707bc45f330a172c608b781e43c928256c1454"} Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.111013 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-67l49" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.112164 4734 status_manager.go:851] "Failed to get status for pod" podUID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" pod="openshift-marketplace/redhat-operators-vj45t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vj45t\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.112473 4734 status_manager.go:851] "Failed to get status for pod" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" pod="openshift-marketplace/redhat-operators-gnbkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gnbkp\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.112866 4734 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.115673 4734 status_manager.go:851] "Failed to get status for pod" podUID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" pod="openshift-marketplace/certified-operators-cbr82" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cbr82\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.116275 4734 status_manager.go:851] "Failed to get status for pod" podUID="15bf5615-0adc-46cd-8796-d419076acac7" pod="openshift-marketplace/community-operators-67l49" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-67l49\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.117270 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"82e6bed1-c7a7-4b50-af1e-68415379c41e","Type":"ContainerDied","Data":"6a0affb05ecd9c22a30f63a029e8adb4f86f39b5fc31b459117f01d2f92b4585"} Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.118277 4734 status_manager.go:851] "Failed to get status for pod" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" pod="openshift-marketplace/redhat-operators-gnbkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gnbkp\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.118502 4734 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.118872 4734 status_manager.go:851] "Failed to get status for pod" podUID="15bf5615-0adc-46cd-8796-d419076acac7" pod="openshift-marketplace/community-operators-67l49" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-67l49\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.119195 4734 status_manager.go:851] "Failed to get status for pod" podUID="82e6bed1-c7a7-4b50-af1e-68415379c41e" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.119511 4734 status_manager.go:851] "Failed to get status for pod" podUID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" pod="openshift-marketplace/certified-operators-cbr82" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cbr82\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.119897 4734 status_manager.go:851] "Failed to get status for pod" podUID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" pod="openshift-marketplace/redhat-operators-vj45t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vj45t\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.117168 4734 generic.go:334] "Generic (PLEG): container finished" podID="82e6bed1-c7a7-4b50-af1e-68415379c41e" containerID="6a0affb05ecd9c22a30f63a029e8adb4f86f39b5fc31b459117f01d2f92b4585" exitCode=0 Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.127879 4734 scope.go:117] "RemoveContainer" containerID="7f0890e578143a4ac9443173ff9c6d3c9de42bd3bc235b95df529824f2955319" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.129343 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.130601 4734 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097" exitCode=0 Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.130636 4734 generic.go:334] "Generic (PLEG): container finished" 
podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6" exitCode=0 Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.130648 4734 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818" exitCode=0 Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.130658 4734 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30" exitCode=2 Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.147289 4734 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.147482 4734 status_manager.go:851] "Failed to get status for pod" podUID="82e6bed1-c7a7-4b50-af1e-68415379c41e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.147649 4734 status_manager.go:851] "Failed to get status for pod" podUID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" pod="openshift-marketplace/certified-operators-cbr82" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cbr82\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.147789 4734 status_manager.go:851] "Failed to get status for pod" podUID="15bf5615-0adc-46cd-8796-d419076acac7" 
pod="openshift-marketplace/community-operators-67l49" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-67l49\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.147933 4734 status_manager.go:851] "Failed to get status for pod" podUID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" pod="openshift-marketplace/redhat-operators-vj45t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vj45t\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.148152 4734 status_manager.go:851] "Failed to get status for pod" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" pod="openshift-marketplace/redhat-operators-gnbkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gnbkp\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.154050 4734 status_manager.go:851] "Failed to get status for pod" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" pod="openshift-marketplace/redhat-operators-gnbkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gnbkp\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.154223 4734 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.154385 4734 status_manager.go:851] "Failed to get status for pod" podUID="15bf5615-0adc-46cd-8796-d419076acac7" 
pod="openshift-marketplace/community-operators-67l49" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-67l49\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.154559 4734 status_manager.go:851] "Failed to get status for pod" podUID="82e6bed1-c7a7-4b50-af1e-68415379c41e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.154739 4734 status_manager.go:851] "Failed to get status for pod" podUID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" pod="openshift-marketplace/certified-operators-cbr82" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cbr82\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.154899 4734 status_manager.go:851] "Failed to get status for pod" podUID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" pod="openshift-marketplace/redhat-operators-vj45t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vj45t\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.161008 4734 scope.go:117] "RemoveContainer" containerID="d5b8b8cb574d08e779955b6608ed0c030124c6ce2887ce6f86a9ec7d78f1b19e" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.192742 4734 scope.go:117] "RemoveContainer" containerID="bea406031d4e50c3113dd26dc4f848e9cc4e8b57d3bc63e09066d4009a36be34" Dec 05 23:23:31 crc kubenswrapper[4734]: E1205 23:23:31.193465 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"bea406031d4e50c3113dd26dc4f848e9cc4e8b57d3bc63e09066d4009a36be34\": container with ID starting with bea406031d4e50c3113dd26dc4f848e9cc4e8b57d3bc63e09066d4009a36be34 not found: ID does not exist" containerID="bea406031d4e50c3113dd26dc4f848e9cc4e8b57d3bc63e09066d4009a36be34" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.193506 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bea406031d4e50c3113dd26dc4f848e9cc4e8b57d3bc63e09066d4009a36be34"} err="failed to get container status \"bea406031d4e50c3113dd26dc4f848e9cc4e8b57d3bc63e09066d4009a36be34\": rpc error: code = NotFound desc = could not find container \"bea406031d4e50c3113dd26dc4f848e9cc4e8b57d3bc63e09066d4009a36be34\": container with ID starting with bea406031d4e50c3113dd26dc4f848e9cc4e8b57d3bc63e09066d4009a36be34 not found: ID does not exist" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.193553 4734 scope.go:117] "RemoveContainer" containerID="7f0890e578143a4ac9443173ff9c6d3c9de42bd3bc235b95df529824f2955319" Dec 05 23:23:31 crc kubenswrapper[4734]: E1205 23:23:31.193899 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f0890e578143a4ac9443173ff9c6d3c9de42bd3bc235b95df529824f2955319\": container with ID starting with 7f0890e578143a4ac9443173ff9c6d3c9de42bd3bc235b95df529824f2955319 not found: ID does not exist" containerID="7f0890e578143a4ac9443173ff9c6d3c9de42bd3bc235b95df529824f2955319" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.193929 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f0890e578143a4ac9443173ff9c6d3c9de42bd3bc235b95df529824f2955319"} err="failed to get container status \"7f0890e578143a4ac9443173ff9c6d3c9de42bd3bc235b95df529824f2955319\": rpc error: code = NotFound desc = could not find container \"7f0890e578143a4ac9443173ff9c6d3c9de42bd3bc235b95df529824f2955319\": container with ID 
starting with 7f0890e578143a4ac9443173ff9c6d3c9de42bd3bc235b95df529824f2955319 not found: ID does not exist" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.193949 4734 scope.go:117] "RemoveContainer" containerID="d5b8b8cb574d08e779955b6608ed0c030124c6ce2887ce6f86a9ec7d78f1b19e" Dec 05 23:23:31 crc kubenswrapper[4734]: E1205 23:23:31.194286 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5b8b8cb574d08e779955b6608ed0c030124c6ce2887ce6f86a9ec7d78f1b19e\": container with ID starting with d5b8b8cb574d08e779955b6608ed0c030124c6ce2887ce6f86a9ec7d78f1b19e not found: ID does not exist" containerID="d5b8b8cb574d08e779955b6608ed0c030124c6ce2887ce6f86a9ec7d78f1b19e" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.194305 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5b8b8cb574d08e779955b6608ed0c030124c6ce2887ce6f86a9ec7d78f1b19e"} err="failed to get container status \"d5b8b8cb574d08e779955b6608ed0c030124c6ce2887ce6f86a9ec7d78f1b19e\": rpc error: code = NotFound desc = could not find container \"d5b8b8cb574d08e779955b6608ed0c030124c6ce2887ce6f86a9ec7d78f1b19e\": container with ID starting with d5b8b8cb574d08e779955b6608ed0c030124c6ce2887ce6f86a9ec7d78f1b19e not found: ID does not exist" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.194320 4734 scope.go:117] "RemoveContainer" containerID="83a62aec78d9ca65c2211b66e19eb5232b6b0a0b1aa4fe44bda308d235afaae3" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.209665 4734 scope.go:117] "RemoveContainer" containerID="9a9d481c6611f4c79cc0f91b17882461c5b0e02a7620eaef3f8402702c6c1dcb" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.226041 4734 scope.go:117] "RemoveContainer" containerID="a7c33c398efa5f848e79e010010315d4c638989f3a5d70bbd187d83da830a16a" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.242347 4734 scope.go:117] "RemoveContainer" 
containerID="83a62aec78d9ca65c2211b66e19eb5232b6b0a0b1aa4fe44bda308d235afaae3" Dec 05 23:23:31 crc kubenswrapper[4734]: E1205 23:23:31.242871 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83a62aec78d9ca65c2211b66e19eb5232b6b0a0b1aa4fe44bda308d235afaae3\": container with ID starting with 83a62aec78d9ca65c2211b66e19eb5232b6b0a0b1aa4fe44bda308d235afaae3 not found: ID does not exist" containerID="83a62aec78d9ca65c2211b66e19eb5232b6b0a0b1aa4fe44bda308d235afaae3" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.242910 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83a62aec78d9ca65c2211b66e19eb5232b6b0a0b1aa4fe44bda308d235afaae3"} err="failed to get container status \"83a62aec78d9ca65c2211b66e19eb5232b6b0a0b1aa4fe44bda308d235afaae3\": rpc error: code = NotFound desc = could not find container \"83a62aec78d9ca65c2211b66e19eb5232b6b0a0b1aa4fe44bda308d235afaae3\": container with ID starting with 83a62aec78d9ca65c2211b66e19eb5232b6b0a0b1aa4fe44bda308d235afaae3 not found: ID does not exist" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.242937 4734 scope.go:117] "RemoveContainer" containerID="9a9d481c6611f4c79cc0f91b17882461c5b0e02a7620eaef3f8402702c6c1dcb" Dec 05 23:23:31 crc kubenswrapper[4734]: E1205 23:23:31.243271 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a9d481c6611f4c79cc0f91b17882461c5b0e02a7620eaef3f8402702c6c1dcb\": container with ID starting with 9a9d481c6611f4c79cc0f91b17882461c5b0e02a7620eaef3f8402702c6c1dcb not found: ID does not exist" containerID="9a9d481c6611f4c79cc0f91b17882461c5b0e02a7620eaef3f8402702c6c1dcb" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.243297 4734 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9a9d481c6611f4c79cc0f91b17882461c5b0e02a7620eaef3f8402702c6c1dcb"} err="failed to get container status \"9a9d481c6611f4c79cc0f91b17882461c5b0e02a7620eaef3f8402702c6c1dcb\": rpc error: code = NotFound desc = could not find container \"9a9d481c6611f4c79cc0f91b17882461c5b0e02a7620eaef3f8402702c6c1dcb\": container with ID starting with 9a9d481c6611f4c79cc0f91b17882461c5b0e02a7620eaef3f8402702c6c1dcb not found: ID does not exist" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.243309 4734 scope.go:117] "RemoveContainer" containerID="a7c33c398efa5f848e79e010010315d4c638989f3a5d70bbd187d83da830a16a" Dec 05 23:23:31 crc kubenswrapper[4734]: E1205 23:23:31.243693 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7c33c398efa5f848e79e010010315d4c638989f3a5d70bbd187d83da830a16a\": container with ID starting with a7c33c398efa5f848e79e010010315d4c638989f3a5d70bbd187d83da830a16a not found: ID does not exist" containerID="a7c33c398efa5f848e79e010010315d4c638989f3a5d70bbd187d83da830a16a" Dec 05 23:23:31 crc kubenswrapper[4734]: I1205 23:23:31.243752 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7c33c398efa5f848e79e010010315d4c638989f3a5d70bbd187d83da830a16a"} err="failed to get container status \"a7c33c398efa5f848e79e010010315d4c638989f3a5d70bbd187d83da830a16a\": rpc error: code = NotFound desc = could not find container \"a7c33c398efa5f848e79e010010315d4c638989f3a5d70bbd187d83da830a16a\": container with ID starting with a7c33c398efa5f848e79e010010315d4c638989f3a5d70bbd187d83da830a16a not found: ID does not exist" Dec 05 23:23:32 crc kubenswrapper[4734]: I1205 23:23:32.493949 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 23:23:32 crc kubenswrapper[4734]: I1205 23:23:32.495689 4734 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:32 crc kubenswrapper[4734]: I1205 23:23:32.496229 4734 status_manager.go:851] "Failed to get status for pod" podUID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" pod="openshift-marketplace/certified-operators-cbr82" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cbr82\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:32 crc kubenswrapper[4734]: I1205 23:23:32.496755 4734 status_manager.go:851] "Failed to get status for pod" podUID="15bf5615-0adc-46cd-8796-d419076acac7" pod="openshift-marketplace/community-operators-67l49" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-67l49\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:32 crc kubenswrapper[4734]: I1205 23:23:32.497174 4734 status_manager.go:851] "Failed to get status for pod" podUID="82e6bed1-c7a7-4b50-af1e-68415379c41e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:32 crc kubenswrapper[4734]: I1205 23:23:32.497691 4734 status_manager.go:851] "Failed to get status for pod" podUID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" pod="openshift-marketplace/redhat-operators-vj45t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vj45t\": 
dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:32 crc kubenswrapper[4734]: I1205 23:23:32.498049 4734 status_manager.go:851] "Failed to get status for pod" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" pod="openshift-marketplace/redhat-operators-gnbkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gnbkp\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:32 crc kubenswrapper[4734]: I1205 23:23:32.670628 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/82e6bed1-c7a7-4b50-af1e-68415379c41e-kube-api-access\") pod \"82e6bed1-c7a7-4b50-af1e-68415379c41e\" (UID: \"82e6bed1-c7a7-4b50-af1e-68415379c41e\") " Dec 05 23:23:32 crc kubenswrapper[4734]: I1205 23:23:32.670694 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/82e6bed1-c7a7-4b50-af1e-68415379c41e-kubelet-dir\") pod \"82e6bed1-c7a7-4b50-af1e-68415379c41e\" (UID: \"82e6bed1-c7a7-4b50-af1e-68415379c41e\") " Dec 05 23:23:32 crc kubenswrapper[4734]: I1205 23:23:32.670795 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/82e6bed1-c7a7-4b50-af1e-68415379c41e-var-lock\") pod \"82e6bed1-c7a7-4b50-af1e-68415379c41e\" (UID: \"82e6bed1-c7a7-4b50-af1e-68415379c41e\") " Dec 05 23:23:32 crc kubenswrapper[4734]: I1205 23:23:32.670863 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82e6bed1-c7a7-4b50-af1e-68415379c41e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "82e6bed1-c7a7-4b50-af1e-68415379c41e" (UID: "82e6bed1-c7a7-4b50-af1e-68415379c41e"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:23:32 crc kubenswrapper[4734]: I1205 23:23:32.671019 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82e6bed1-c7a7-4b50-af1e-68415379c41e-var-lock" (OuterVolumeSpecName: "var-lock") pod "82e6bed1-c7a7-4b50-af1e-68415379c41e" (UID: "82e6bed1-c7a7-4b50-af1e-68415379c41e"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:23:32 crc kubenswrapper[4734]: I1205 23:23:32.671567 4734 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/82e6bed1-c7a7-4b50-af1e-68415379c41e-var-lock\") on node \"crc\" DevicePath \"\"" Dec 05 23:23:32 crc kubenswrapper[4734]: I1205 23:23:32.671596 4734 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/82e6bed1-c7a7-4b50-af1e-68415379c41e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 23:23:32 crc kubenswrapper[4734]: I1205 23:23:32.677337 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82e6bed1-c7a7-4b50-af1e-68415379c41e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "82e6bed1-c7a7-4b50-af1e-68415379c41e" (UID: "82e6bed1-c7a7-4b50-af1e-68415379c41e"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:23:32 crc kubenswrapper[4734]: I1205 23:23:32.773581 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/82e6bed1-c7a7-4b50-af1e-68415379c41e-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 23:23:33 crc kubenswrapper[4734]: I1205 23:23:33.167291 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"82e6bed1-c7a7-4b50-af1e-68415379c41e","Type":"ContainerDied","Data":"52c1e9f2b300bb165aa1ec7c0f08c51d486e2f9e7e2de6959430710ea2c67d9d"} Dec 05 23:23:33 crc kubenswrapper[4734]: I1205 23:23:33.167853 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52c1e9f2b300bb165aa1ec7c0f08c51d486e2f9e7e2de6959430710ea2c67d9d" Dec 05 23:23:33 crc kubenswrapper[4734]: I1205 23:23:33.167367 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 23:23:33 crc kubenswrapper[4734]: I1205 23:23:33.185178 4734 status_manager.go:851] "Failed to get status for pod" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" pod="openshift-marketplace/redhat-operators-gnbkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gnbkp\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:33 crc kubenswrapper[4734]: I1205 23:23:33.185739 4734 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:33 crc kubenswrapper[4734]: I1205 23:23:33.186685 4734 status_manager.go:851] "Failed to get status for pod" 
podUID="82e6bed1-c7a7-4b50-af1e-68415379c41e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:33 crc kubenswrapper[4734]: I1205 23:23:33.186980 4734 status_manager.go:851] "Failed to get status for pod" podUID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" pod="openshift-marketplace/certified-operators-cbr82" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cbr82\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:33 crc kubenswrapper[4734]: I1205 23:23:33.187342 4734 status_manager.go:851] "Failed to get status for pod" podUID="15bf5615-0adc-46cd-8796-d419076acac7" pod="openshift-marketplace/community-operators-67l49" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-67l49\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:33 crc kubenswrapper[4734]: I1205 23:23:33.187712 4734 status_manager.go:851] "Failed to get status for pod" podUID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" pod="openshift-marketplace/redhat-operators-vj45t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vj45t\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:34 crc kubenswrapper[4734]: I1205 23:23:34.179449 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 23:23:34 crc kubenswrapper[4734]: I1205 23:23:34.182028 4734 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0" exitCode=0 Dec 05 23:23:34 crc kubenswrapper[4734]: E1205 23:23:34.216782 
4734 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:34 crc kubenswrapper[4734]: E1205 23:23:34.217646 4734 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:34 crc kubenswrapper[4734]: E1205 23:23:34.218073 4734 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:34 crc kubenswrapper[4734]: E1205 23:23:34.218445 4734 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:34 crc kubenswrapper[4734]: E1205 23:23:34.218792 4734 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:34 crc kubenswrapper[4734]: I1205 23:23:34.218894 4734 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 05 23:23:34 crc kubenswrapper[4734]: E1205 23:23:34.219176 4734 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="200ms" Dec 05 23:23:34 crc kubenswrapper[4734]: E1205 23:23:34.420923 
4734 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="400ms" Dec 05 23:23:34 crc kubenswrapper[4734]: I1205 23:23:34.646417 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 23:23:34 crc kubenswrapper[4734]: I1205 23:23:34.647510 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:23:34 crc kubenswrapper[4734]: I1205 23:23:34.648350 4734 status_manager.go:851] "Failed to get status for pod" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" pod="openshift-marketplace/redhat-operators-gnbkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gnbkp\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:34 crc kubenswrapper[4734]: I1205 23:23:34.648798 4734 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:34 crc kubenswrapper[4734]: I1205 23:23:34.649045 4734 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:34 crc kubenswrapper[4734]: I1205 23:23:34.649354 4734 status_manager.go:851] "Failed to get 
status for pod" podUID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" pod="openshift-marketplace/certified-operators-cbr82" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cbr82\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:34 crc kubenswrapper[4734]: I1205 23:23:34.649889 4734 status_manager.go:851] "Failed to get status for pod" podUID="15bf5615-0adc-46cd-8796-d419076acac7" pod="openshift-marketplace/community-operators-67l49" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-67l49\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:34 crc kubenswrapper[4734]: I1205 23:23:34.650347 4734 status_manager.go:851] "Failed to get status for pod" podUID="82e6bed1-c7a7-4b50-af1e-68415379c41e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:34 crc kubenswrapper[4734]: I1205 23:23:34.650661 4734 status_manager.go:851] "Failed to get status for pod" podUID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" pod="openshift-marketplace/redhat-operators-vj45t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vj45t\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:34 crc kubenswrapper[4734]: I1205 23:23:34.798092 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 05 23:23:34 crc kubenswrapper[4734]: I1205 23:23:34.798220 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 05 23:23:34 crc kubenswrapper[4734]: I1205 23:23:34.798301 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 05 23:23:34 crc kubenswrapper[4734]: I1205 23:23:34.798350 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:23:34 crc kubenswrapper[4734]: I1205 23:23:34.798415 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:23:34 crc kubenswrapper[4734]: I1205 23:23:34.798438 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:23:34 crc kubenswrapper[4734]: I1205 23:23:34.800236 4734 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 05 23:23:34 crc kubenswrapper[4734]: I1205 23:23:34.800264 4734 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 05 23:23:34 crc kubenswrapper[4734]: I1205 23:23:34.800275 4734 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 05 23:23:34 crc kubenswrapper[4734]: E1205 23:23:34.821551 4734 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="800ms" Dec 05 23:23:35 crc kubenswrapper[4734]: I1205 23:23:35.194587 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 23:23:35 crc kubenswrapper[4734]: I1205 23:23:35.195886 4734 scope.go:117] "RemoveContainer" containerID="a671afbbc9df0b998f3739224ef9149057fab165497610198788fc2c330e2097" Dec 05 23:23:35 crc kubenswrapper[4734]: I1205 23:23:35.195986 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:23:35 crc kubenswrapper[4734]: I1205 23:23:35.212318 4734 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:35 crc kubenswrapper[4734]: I1205 23:23:35.213194 4734 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:35 crc kubenswrapper[4734]: I1205 23:23:35.213745 4734 status_manager.go:851] "Failed to get status for pod" podUID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" pod="openshift-marketplace/certified-operators-cbr82" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cbr82\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:35 crc kubenswrapper[4734]: I1205 23:23:35.214054 4734 status_manager.go:851] "Failed to get status for pod" podUID="15bf5615-0adc-46cd-8796-d419076acac7" pod="openshift-marketplace/community-operators-67l49" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-67l49\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:35 crc kubenswrapper[4734]: I1205 23:23:35.214319 4734 status_manager.go:851] "Failed to get status for pod" podUID="82e6bed1-c7a7-4b50-af1e-68415379c41e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": 
dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:35 crc kubenswrapper[4734]: I1205 23:23:35.214562 4734 status_manager.go:851] "Failed to get status for pod" podUID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" pod="openshift-marketplace/redhat-operators-vj45t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vj45t\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:35 crc kubenswrapper[4734]: I1205 23:23:35.214936 4734 scope.go:117] "RemoveContainer" containerID="b2bde0e0de9924009783b5c1583064d3fcd60604352556ba4e52c44c79e536b6" Dec 05 23:23:35 crc kubenswrapper[4734]: I1205 23:23:35.214958 4734 status_manager.go:851] "Failed to get status for pod" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" pod="openshift-marketplace/redhat-operators-gnbkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gnbkp\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:35 crc kubenswrapper[4734]: I1205 23:23:35.232152 4734 scope.go:117] "RemoveContainer" containerID="6312b8d5f4b62fafdc75dcada3960a3ee97a0bf8b5cdc36f2cc12edbc2ffd818" Dec 05 23:23:35 crc kubenswrapper[4734]: I1205 23:23:35.246350 4734 scope.go:117] "RemoveContainer" containerID="a095462763f7e81a1019219a4f4a843fbac80a2c15929599e2c57663a82bdd30" Dec 05 23:23:35 crc kubenswrapper[4734]: I1205 23:23:35.262160 4734 scope.go:117] "RemoveContainer" containerID="4fd6fffe2a0baf029a5d5aa589242755862c64c0074f4874cc558b4a7c7972b0" Dec 05 23:23:35 crc kubenswrapper[4734]: I1205 23:23:35.282692 4734 scope.go:117] "RemoveContainer" containerID="ff51a1ef13914624ddd31039d2efffba965b3201e335afd7852ebdaa64b6a48a" Dec 05 23:23:35 crc kubenswrapper[4734]: E1205 23:23:35.624481 4734 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.38:6443: connect: connection refused" interval="1.6s" Dec 05 23:23:35 crc kubenswrapper[4734]: I1205 23:23:35.628441 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 05 23:23:37 crc kubenswrapper[4734]: E1205 23:23:37.226914 4734 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="3.2s" Dec 05 23:23:39 crc kubenswrapper[4734]: I1205 23:23:39.617299 4734 status_manager.go:851] "Failed to get status for pod" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" pod="openshift-marketplace/redhat-operators-gnbkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gnbkp\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:39 crc kubenswrapper[4734]: I1205 23:23:39.618321 4734 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:39 crc kubenswrapper[4734]: I1205 23:23:39.618871 4734 status_manager.go:851] "Failed to get status for pod" podUID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" pod="openshift-marketplace/certified-operators-cbr82" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cbr82\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:39 crc kubenswrapper[4734]: I1205 23:23:39.619387 4734 status_manager.go:851] "Failed to get status for pod" 
podUID="15bf5615-0adc-46cd-8796-d419076acac7" pod="openshift-marketplace/community-operators-67l49" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-67l49\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:39 crc kubenswrapper[4734]: I1205 23:23:39.619776 4734 status_manager.go:851] "Failed to get status for pod" podUID="82e6bed1-c7a7-4b50-af1e-68415379c41e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:39 crc kubenswrapper[4734]: I1205 23:23:39.620416 4734 status_manager.go:851] "Failed to get status for pod" podUID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" pod="openshift-marketplace/redhat-operators-vj45t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vj45t\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:39 crc kubenswrapper[4734]: E1205 23:23:39.666459 4734 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.38:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" volumeName="registry-storage" Dec 05 23:23:40 crc kubenswrapper[4734]: E1205 23:23:40.428717 4734 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="6.4s" Dec 05 23:23:40 crc kubenswrapper[4734]: E1205 23:23:40.600007 4734 event.go:368] 
"Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.38:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e753d55319799 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 23:23:30.258581401 +0000 UTC m=+230.941985677,LastTimestamp:2025-12-05 23:23:30.258581401 +0000 UTC m=+230.941985677,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 05 23:23:42 crc kubenswrapper[4734]: I1205 23:23:42.613792 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:23:42 crc kubenswrapper[4734]: I1205 23:23:42.615413 4734 status_manager.go:851] "Failed to get status for pod" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" pod="openshift-marketplace/redhat-operators-gnbkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gnbkp\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:42 crc kubenswrapper[4734]: I1205 23:23:42.615925 4734 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:42 crc kubenswrapper[4734]: I1205 23:23:42.616449 4734 status_manager.go:851] "Failed to get status for pod" podUID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" pod="openshift-marketplace/certified-operators-cbr82" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cbr82\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:42 crc kubenswrapper[4734]: I1205 23:23:42.616794 4734 status_manager.go:851] "Failed to get status for pod" podUID="15bf5615-0adc-46cd-8796-d419076acac7" pod="openshift-marketplace/community-operators-67l49" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-67l49\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:42 crc kubenswrapper[4734]: I1205 23:23:42.617104 4734 status_manager.go:851] "Failed to get status for pod" podUID="82e6bed1-c7a7-4b50-af1e-68415379c41e" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:42 crc kubenswrapper[4734]: I1205 23:23:42.617338 4734 status_manager.go:851] "Failed to get status for pod" podUID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" pod="openshift-marketplace/redhat-operators-vj45t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vj45t\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:42 crc kubenswrapper[4734]: I1205 23:23:42.629803 4734 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7414d8e5-13fa-40b1-b442-3ceee2425ee1" Dec 05 23:23:42 crc kubenswrapper[4734]: I1205 23:23:42.629830 4734 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7414d8e5-13fa-40b1-b442-3ceee2425ee1" Dec 05 23:23:42 crc kubenswrapper[4734]: E1205 23:23:42.630275 4734 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:23:42 crc kubenswrapper[4734]: I1205 23:23:42.630922 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:23:42 crc kubenswrapper[4734]: W1205 23:23:42.649358 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-788a799e176098647e0a3eec6542c3ae8978d0e73df45cd4b6065e4cfe272637 WatchSource:0}: Error finding container 788a799e176098647e0a3eec6542c3ae8978d0e73df45cd4b6065e4cfe272637: Status 404 returned error can't find the container with id 788a799e176098647e0a3eec6542c3ae8978d0e73df45cd4b6065e4cfe272637 Dec 05 23:23:43 crc kubenswrapper[4734]: I1205 23:23:43.245589 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"788a799e176098647e0a3eec6542c3ae8978d0e73df45cd4b6065e4cfe272637"} Dec 05 23:23:43 crc kubenswrapper[4734]: I1205 23:23:43.369030 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" podUID="e40087a5-cb18-4d2f-8e68-cc6e09c5bd87" containerName="oauth-openshift" containerID="cri-o://7ca3c043f4229050f39ccf624b9f967a42b32001cbf42d05b1942e0893eb4dbb" gracePeriod=15 Dec 05 23:23:44 crc kubenswrapper[4734]: I1205 23:23:44.475203 4734 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 05 23:23:44 crc kubenswrapper[4734]: I1205 23:23:44.475279 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get 
\"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 05 23:23:44 crc kubenswrapper[4734]: I1205 23:23:44.881500 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:23:44 crc kubenswrapper[4734]: I1205 23:23:44.883089 4734 status_manager.go:851] "Failed to get status for pod" podUID="e40087a5-cb18-4d2f-8e68-cc6e09c5bd87" pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-gxdpj\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:44 crc kubenswrapper[4734]: I1205 23:23:44.883582 4734 status_manager.go:851] "Failed to get status for pod" podUID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" pod="openshift-marketplace/redhat-operators-vj45t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vj45t\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:44 crc kubenswrapper[4734]: I1205 23:23:44.883891 4734 status_manager.go:851] "Failed to get status for pod" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" pod="openshift-marketplace/redhat-operators-gnbkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gnbkp\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:44 crc kubenswrapper[4734]: I1205 23:23:44.884235 4734 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:44 crc kubenswrapper[4734]: I1205 23:23:44.884694 4734 
status_manager.go:851] "Failed to get status for pod" podUID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" pod="openshift-marketplace/certified-operators-cbr82" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cbr82\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:44 crc kubenswrapper[4734]: I1205 23:23:44.885392 4734 status_manager.go:851] "Failed to get status for pod" podUID="15bf5615-0adc-46cd-8796-d419076acac7" pod="openshift-marketplace/community-operators-67l49" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-67l49\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:44 crc kubenswrapper[4734]: I1205 23:23:44.885715 4734 status_manager.go:851] "Failed to get status for pod" podUID="82e6bed1-c7a7-4b50-af1e-68415379c41e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.042425 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-user-template-provider-selection\") pod \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.042516 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-ocp-branding-template\") pod \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.042585 4734 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-cliconfig\") pod \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.042637 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-user-template-error\") pod \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.042687 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-user-idp-0-file-data\") pod \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.042767 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-service-ca\") pod \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.042808 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-router-certs\") pod \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.042861 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-user-template-login\") pod \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.042905 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-audit-policies\") pod \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.042939 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-trusted-ca-bundle\") pod \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.043005 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-serving-cert\") pod \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.043039 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-session\") pod \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.043079 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-audit-dir\") pod \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.044178 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "e40087a5-cb18-4d2f-8e68-cc6e09c5bd87" (UID: "e40087a5-cb18-4d2f-8e68-cc6e09c5bd87"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.044196 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "e40087a5-cb18-4d2f-8e68-cc6e09c5bd87" (UID: "e40087a5-cb18-4d2f-8e68-cc6e09c5bd87"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.044596 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "e40087a5-cb18-4d2f-8e68-cc6e09c5bd87" (UID: "e40087a5-cb18-4d2f-8e68-cc6e09c5bd87"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.044779 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6zcx\" (UniqueName: \"kubernetes.io/projected/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-kube-api-access-t6zcx\") pod \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\" (UID: \"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87\") " Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.045342 4734 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.045378 4734 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.045397 4734 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.045870 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "e40087a5-cb18-4d2f-8e68-cc6e09c5bd87" (UID: "e40087a5-cb18-4d2f-8e68-cc6e09c5bd87"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.046963 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "e40087a5-cb18-4d2f-8e68-cc6e09c5bd87" (UID: "e40087a5-cb18-4d2f-8e68-cc6e09c5bd87"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.051484 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "e40087a5-cb18-4d2f-8e68-cc6e09c5bd87" (UID: "e40087a5-cb18-4d2f-8e68-cc6e09c5bd87"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.051923 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "e40087a5-cb18-4d2f-8e68-cc6e09c5bd87" (UID: "e40087a5-cb18-4d2f-8e68-cc6e09c5bd87"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.052383 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "e40087a5-cb18-4d2f-8e68-cc6e09c5bd87" (UID: "e40087a5-cb18-4d2f-8e68-cc6e09c5bd87"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.053131 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "e40087a5-cb18-4d2f-8e68-cc6e09c5bd87" (UID: "e40087a5-cb18-4d2f-8e68-cc6e09c5bd87"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.053217 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "e40087a5-cb18-4d2f-8e68-cc6e09c5bd87" (UID: "e40087a5-cb18-4d2f-8e68-cc6e09c5bd87"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.053466 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "e40087a5-cb18-4d2f-8e68-cc6e09c5bd87" (UID: "e40087a5-cb18-4d2f-8e68-cc6e09c5bd87"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.053705 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "e40087a5-cb18-4d2f-8e68-cc6e09c5bd87" (UID: "e40087a5-cb18-4d2f-8e68-cc6e09c5bd87"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.054208 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-kube-api-access-t6zcx" (OuterVolumeSpecName: "kube-api-access-t6zcx") pod "e40087a5-cb18-4d2f-8e68-cc6e09c5bd87" (UID: "e40087a5-cb18-4d2f-8e68-cc6e09c5bd87"). InnerVolumeSpecName "kube-api-access-t6zcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.054292 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "e40087a5-cb18-4d2f-8e68-cc6e09c5bd87" (UID: "e40087a5-cb18-4d2f-8e68-cc6e09c5bd87"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.146940 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6zcx\" (UniqueName: \"kubernetes.io/projected/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-kube-api-access-t6zcx\") on node \"crc\" DevicePath \"\"" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.146988 4734 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.147001 4734 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 05 23:23:45 crc kubenswrapper[4734]: 
I1205 23:23:45.147014 4734 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.147026 4734 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.147037 4734 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.147046 4734 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.147057 4734 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.147067 4734 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.147076 4734 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.147087 4734 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.261185 4734 generic.go:334] "Generic (PLEG): container finished" podID="e40087a5-cb18-4d2f-8e68-cc6e09c5bd87" containerID="7ca3c043f4229050f39ccf624b9f967a42b32001cbf42d05b1942e0893eb4dbb" exitCode=0 Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.261309 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" event={"ID":"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87","Type":"ContainerDied","Data":"7ca3c043f4229050f39ccf624b9f967a42b32001cbf42d05b1942e0893eb4dbb"} Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.261346 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" event={"ID":"e40087a5-cb18-4d2f-8e68-cc6e09c5bd87","Type":"ContainerDied","Data":"d39df6290a6f17ad114964be904d291712beddea87d7322887a8f796551773fd"} Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.261347 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.261368 4734 scope.go:117] "RemoveContainer" containerID="7ca3c043f4229050f39ccf624b9f967a42b32001cbf42d05b1942e0893eb4dbb" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.262576 4734 status_manager.go:851] "Failed to get status for pod" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" pod="openshift-marketplace/redhat-operators-gnbkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gnbkp\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.263051 4734 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.263451 4734 status_manager.go:851] "Failed to get status for pod" podUID="82e6bed1-c7a7-4b50-af1e-68415379c41e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.263687 4734 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="3274bb3962a1004535439aec5d506e297863d35be7c8fdfdf56c2e23efea908d" exitCode=0 Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.263753 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"3274bb3962a1004535439aec5d506e297863d35be7c8fdfdf56c2e23efea908d"} Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.264128 4734 status_manager.go:851] "Failed to get status for pod" podUID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" pod="openshift-marketplace/certified-operators-cbr82" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cbr82\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.264388 4734 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7414d8e5-13fa-40b1-b442-3ceee2425ee1" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.264460 4734 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7414d8e5-13fa-40b1-b442-3ceee2425ee1" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.264448 4734 status_manager.go:851] "Failed to get status for pod" podUID="15bf5615-0adc-46cd-8796-d419076acac7" pod="openshift-marketplace/community-operators-67l49" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-67l49\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:45 crc kubenswrapper[4734]: E1205 23:23:45.265309 4734 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.265355 4734 status_manager.go:851] "Failed to get status for pod" podUID="e40087a5-cb18-4d2f-8e68-cc6e09c5bd87" pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-gxdpj\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.266267 4734 status_manager.go:851] "Failed to get status for pod" podUID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" pod="openshift-marketplace/redhat-operators-vj45t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vj45t\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.267027 4734 status_manager.go:851] "Failed to get status for pod" podUID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" pod="openshift-marketplace/redhat-operators-vj45t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vj45t\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.267492 4734 status_manager.go:851] "Failed to get status for pod" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" pod="openshift-marketplace/redhat-operators-gnbkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gnbkp\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.267979 4734 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.268415 4734 status_manager.go:851] "Failed to get status for pod" podUID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" pod="openshift-marketplace/certified-operators-cbr82" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cbr82\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.268935 4734 status_manager.go:851] "Failed to get status for pod" podUID="15bf5615-0adc-46cd-8796-d419076acac7" pod="openshift-marketplace/community-operators-67l49" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-67l49\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.269417 4734 status_manager.go:851] "Failed to get status for pod" podUID="82e6bed1-c7a7-4b50-af1e-68415379c41e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.270132 4734 status_manager.go:851] "Failed to get status for pod" podUID="e40087a5-cb18-4d2f-8e68-cc6e09c5bd87" pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-gxdpj\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.270828 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.270901 4734 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7" exitCode=1 Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.270944 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7"} Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.271469 4734 scope.go:117] "RemoveContainer" containerID="c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.271885 4734 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.272896 4734 status_manager.go:851] "Failed to get status for pod" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" pod="openshift-marketplace/redhat-operators-gnbkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gnbkp\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.273257 4734 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.273860 4734 status_manager.go:851] "Failed to get status for pod" podUID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" pod="openshift-marketplace/certified-operators-cbr82" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cbr82\": dial tcp 38.102.83.38:6443: connect: 
connection refused" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.274700 4734 status_manager.go:851] "Failed to get status for pod" podUID="15bf5615-0adc-46cd-8796-d419076acac7" pod="openshift-marketplace/community-operators-67l49" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-67l49\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.276079 4734 status_manager.go:851] "Failed to get status for pod" podUID="82e6bed1-c7a7-4b50-af1e-68415379c41e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.277070 4734 status_manager.go:851] "Failed to get status for pod" podUID="e40087a5-cb18-4d2f-8e68-cc6e09c5bd87" pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-gxdpj\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.277917 4734 status_manager.go:851] "Failed to get status for pod" podUID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" pod="openshift-marketplace/redhat-operators-vj45t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vj45t\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.286331 4734 status_manager.go:851] "Failed to get status for pod" podUID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" pod="openshift-marketplace/redhat-operators-vj45t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vj45t\": dial tcp 38.102.83.38:6443: connect: connection refused" 
Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.286777 4734 scope.go:117] "RemoveContainer" containerID="7ca3c043f4229050f39ccf624b9f967a42b32001cbf42d05b1942e0893eb4dbb" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.286976 4734 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:45 crc kubenswrapper[4734]: E1205 23:23:45.287423 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ca3c043f4229050f39ccf624b9f967a42b32001cbf42d05b1942e0893eb4dbb\": container with ID starting with 7ca3c043f4229050f39ccf624b9f967a42b32001cbf42d05b1942e0893eb4dbb not found: ID does not exist" containerID="7ca3c043f4229050f39ccf624b9f967a42b32001cbf42d05b1942e0893eb4dbb" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.287505 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ca3c043f4229050f39ccf624b9f967a42b32001cbf42d05b1942e0893eb4dbb"} err="failed to get container status \"7ca3c043f4229050f39ccf624b9f967a42b32001cbf42d05b1942e0893eb4dbb\": rpc error: code = NotFound desc = could not find container \"7ca3c043f4229050f39ccf624b9f967a42b32001cbf42d05b1942e0893eb4dbb\": container with ID starting with 7ca3c043f4229050f39ccf624b9f967a42b32001cbf42d05b1942e0893eb4dbb not found: ID does not exist" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.287422 4734 status_manager.go:851] "Failed to get status for pod" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" pod="openshift-marketplace/redhat-operators-gnbkp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-gnbkp\": dial 
tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.288074 4734 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.288625 4734 status_manager.go:851] "Failed to get status for pod" podUID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" pod="openshift-marketplace/certified-operators-cbr82" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-cbr82\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.289185 4734 status_manager.go:851] "Failed to get status for pod" podUID="15bf5615-0adc-46cd-8796-d419076acac7" pod="openshift-marketplace/community-operators-67l49" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-67l49\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.289741 4734 status_manager.go:851] "Failed to get status for pod" podUID="82e6bed1-c7a7-4b50-af1e-68415379c41e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:45 crc kubenswrapper[4734]: I1205 23:23:45.290211 4734 status_manager.go:851] "Failed to get status for pod" podUID="e40087a5-cb18-4d2f-8e68-cc6e09c5bd87" pod="openshift-authentication/oauth-openshift-558db77b4-gxdpj" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-gxdpj\": dial tcp 38.102.83.38:6443: connect: connection refused" Dec 05 23:23:46 crc kubenswrapper[4734]: I1205 23:23:46.282731 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5e1ada752228bc8fbe4430aed340a66fb7ec49a18844da3c3a60888f4d756f40"} Dec 05 23:23:46 crc kubenswrapper[4734]: I1205 23:23:46.283176 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"30c1efcea3096e8b515260b6f36710801e73f8db14f4ed8f95f75154aa855a44"} Dec 05 23:23:46 crc kubenswrapper[4734]: I1205 23:23:46.283189 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0005011dad6af23b5e1efc9a837eceaee508619243b14c286382adb2abe98668"} Dec 05 23:23:46 crc kubenswrapper[4734]: I1205 23:23:46.287797 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 05 23:23:46 crc kubenswrapper[4734]: I1205 23:23:46.287867 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"84c8b1cd7aeab417c9fc285bc82649da2e0052260590a1a89126a789ceb1d32f"} Dec 05 23:23:46 crc kubenswrapper[4734]: I1205 23:23:46.959591 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 23:23:46 crc kubenswrapper[4734]: I1205 23:23:46.960379 4734 
patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 05 23:23:46 crc kubenswrapper[4734]: I1205 23:23:46.960594 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 05 23:23:47 crc kubenswrapper[4734]: I1205 23:23:47.302704 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"33a34527ff78e01637cac00d2633b26abfc78bea4660489e723ce8ae0c5f85e8"} Dec 05 23:23:47 crc kubenswrapper[4734]: I1205 23:23:47.302766 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b5f59f806841461272ac93801a73ea2194767853fd7835c3c56fea7955cedcfb"} Dec 05 23:23:47 crc kubenswrapper[4734]: I1205 23:23:47.303919 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:23:47 crc kubenswrapper[4734]: I1205 23:23:47.304056 4734 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7414d8e5-13fa-40b1-b442-3ceee2425ee1" Dec 05 23:23:47 crc kubenswrapper[4734]: I1205 23:23:47.304131 4734 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7414d8e5-13fa-40b1-b442-3ceee2425ee1" Dec 05 23:23:47 crc kubenswrapper[4734]: I1205 23:23:47.631418 
4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:23:47 crc kubenswrapper[4734]: I1205 23:23:47.631690 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:23:47 crc kubenswrapper[4734]: I1205 23:23:47.636360 4734 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 23:23:47 crc kubenswrapper[4734]: [+]log ok Dec 05 23:23:47 crc kubenswrapper[4734]: [+]etcd ok Dec 05 23:23:47 crc kubenswrapper[4734]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 05 23:23:47 crc kubenswrapper[4734]: [+]poststarthook/openshift.io-api-request-count-filter ok Dec 05 23:23:47 crc kubenswrapper[4734]: [+]poststarthook/openshift.io-startkubeinformers ok Dec 05 23:23:47 crc kubenswrapper[4734]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Dec 05 23:23:47 crc kubenswrapper[4734]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Dec 05 23:23:47 crc kubenswrapper[4734]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 23:23:47 crc kubenswrapper[4734]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 23:23:47 crc kubenswrapper[4734]: [+]poststarthook/priority-and-fairness-config-consumer ok Dec 05 23:23:47 crc kubenswrapper[4734]: [+]poststarthook/priority-and-fairness-filter ok Dec 05 23:23:47 crc kubenswrapper[4734]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 23:23:47 crc kubenswrapper[4734]: [+]poststarthook/start-apiextensions-informers ok Dec 05 23:23:47 crc kubenswrapper[4734]: [+]poststarthook/start-apiextensions-controllers ok Dec 05 23:23:47 crc kubenswrapper[4734]: [+]poststarthook/crd-informer-synced ok Dec 05 23:23:47 crc kubenswrapper[4734]: 
[+]poststarthook/start-system-namespaces-controller ok Dec 05 23:23:47 crc kubenswrapper[4734]: [+]poststarthook/start-cluster-authentication-info-controller ok Dec 05 23:23:47 crc kubenswrapper[4734]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Dec 05 23:23:47 crc kubenswrapper[4734]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Dec 05 23:23:47 crc kubenswrapper[4734]: [+]poststarthook/start-legacy-token-tracking-controller ok Dec 05 23:23:47 crc kubenswrapper[4734]: [+]poststarthook/start-service-ip-repair-controllers ok Dec 05 23:23:47 crc kubenswrapper[4734]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Dec 05 23:23:47 crc kubenswrapper[4734]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Dec 05 23:23:47 crc kubenswrapper[4734]: [+]poststarthook/priority-and-fairness-config-producer ok Dec 05 23:23:47 crc kubenswrapper[4734]: [+]poststarthook/bootstrap-controller ok Dec 05 23:23:47 crc kubenswrapper[4734]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Dec 05 23:23:47 crc kubenswrapper[4734]: [+]poststarthook/start-kube-aggregator-informers ok Dec 05 23:23:47 crc kubenswrapper[4734]: [+]poststarthook/apiservice-status-local-available-controller ok Dec 05 23:23:47 crc kubenswrapper[4734]: [+]poststarthook/apiservice-status-remote-available-controller ok Dec 05 23:23:47 crc kubenswrapper[4734]: [+]poststarthook/apiservice-registration-controller ok Dec 05 23:23:47 crc kubenswrapper[4734]: [+]poststarthook/apiservice-wait-for-first-sync ok Dec 05 23:23:47 crc kubenswrapper[4734]: [+]poststarthook/apiservice-discovery-controller ok Dec 05 23:23:47 crc kubenswrapper[4734]: [+]poststarthook/kube-apiserver-autoregistration ok Dec 05 23:23:47 crc kubenswrapper[4734]: [+]autoregister-completion ok Dec 05 23:23:47 crc kubenswrapper[4734]: [+]poststarthook/apiservice-openapi-controller ok Dec 05 23:23:47 crc kubenswrapper[4734]: 
[+]poststarthook/apiservice-openapiv3-controller ok Dec 05 23:23:47 crc kubenswrapper[4734]: livez check failed Dec 05 23:23:47 crc kubenswrapper[4734]: I1205 23:23:47.636414 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 23:23:48 crc kubenswrapper[4734]: I1205 23:23:48.160911 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 23:23:52 crc kubenswrapper[4734]: I1205 23:23:52.315929 4734 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:23:52 crc kubenswrapper[4734]: I1205 23:23:52.639305 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:23:52 crc kubenswrapper[4734]: I1205 23:23:52.644339 4734 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="80bf24ff-c987-4c6f-9820-67d0807733be" Dec 05 23:23:53 crc kubenswrapper[4734]: I1205 23:23:53.340423 4734 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7414d8e5-13fa-40b1-b442-3ceee2425ee1" Dec 05 23:23:53 crc kubenswrapper[4734]: I1205 23:23:53.340454 4734 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7414d8e5-13fa-40b1-b442-3ceee2425ee1" Dec 05 23:23:53 crc kubenswrapper[4734]: I1205 23:23:53.345550 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:23:54 crc kubenswrapper[4734]: I1205 23:23:54.347968 4734 kubelet.go:1909] "Trying to delete pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7414d8e5-13fa-40b1-b442-3ceee2425ee1" Dec 05 23:23:54 crc kubenswrapper[4734]: I1205 23:23:54.348023 4734 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7414d8e5-13fa-40b1-b442-3ceee2425ee1" Dec 05 23:23:56 crc kubenswrapper[4734]: I1205 23:23:56.959819 4734 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 05 23:23:56 crc kubenswrapper[4734]: I1205 23:23:56.960439 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 05 23:23:59 crc kubenswrapper[4734]: I1205 23:23:59.654913 4734 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="80bf24ff-c987-4c6f-9820-67d0807733be" Dec 05 23:24:01 crc kubenswrapper[4734]: I1205 23:24:01.360298 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 05 23:24:01 crc kubenswrapper[4734]: I1205 23:24:01.455646 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 05 23:24:01 crc kubenswrapper[4734]: I1205 23:24:01.854273 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 05 23:24:02 crc kubenswrapper[4734]: I1205 23:24:02.272356 4734 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 05 23:24:03 crc kubenswrapper[4734]: I1205 23:24:03.310765 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 05 23:24:03 crc kubenswrapper[4734]: I1205 23:24:03.313997 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 05 23:24:03 crc kubenswrapper[4734]: I1205 23:24:03.404570 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 05 23:24:03 crc kubenswrapper[4734]: I1205 23:24:03.597331 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 23:24:03 crc kubenswrapper[4734]: I1205 23:24:03.801078 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 05 23:24:04 crc kubenswrapper[4734]: I1205 23:24:04.324502 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 05 23:24:04 crc kubenswrapper[4734]: I1205 23:24:04.434062 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 05 23:24:04 crc kubenswrapper[4734]: I1205 23:24:04.463947 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 05 23:24:04 crc kubenswrapper[4734]: I1205 23:24:04.696252 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 05 23:24:04 crc kubenswrapper[4734]: I1205 23:24:04.707309 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 05 23:24:04 crc kubenswrapper[4734]: 
I1205 23:24:04.791380 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 05 23:24:04 crc kubenswrapper[4734]: I1205 23:24:04.875016 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 05 23:24:04 crc kubenswrapper[4734]: I1205 23:24:04.993241 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 05 23:24:05 crc kubenswrapper[4734]: I1205 23:24:05.052026 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 05 23:24:05 crc kubenswrapper[4734]: I1205 23:24:05.093990 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 05 23:24:05 crc kubenswrapper[4734]: I1205 23:24:05.186391 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 05 23:24:05 crc kubenswrapper[4734]: I1205 23:24:05.222022 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 05 23:24:05 crc kubenswrapper[4734]: I1205 23:24:05.278498 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 05 23:24:05 crc kubenswrapper[4734]: I1205 23:24:05.320753 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 05 23:24:05 crc kubenswrapper[4734]: I1205 23:24:05.353454 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 05 23:24:05 crc kubenswrapper[4734]: I1205 23:24:05.355642 4734 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 05 23:24:05 crc kubenswrapper[4734]: I1205 23:24:05.372084 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 05 23:24:05 crc kubenswrapper[4734]: I1205 23:24:05.465922 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 05 23:24:05 crc kubenswrapper[4734]: I1205 23:24:05.466975 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 05 23:24:05 crc kubenswrapper[4734]: I1205 23:24:05.486940 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 05 23:24:05 crc kubenswrapper[4734]: I1205 23:24:05.580899 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 05 23:24:05 crc kubenswrapper[4734]: I1205 23:24:05.622733 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 05 23:24:05 crc kubenswrapper[4734]: I1205 23:24:05.665609 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 23:24:05 crc kubenswrapper[4734]: I1205 23:24:05.725826 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 05 23:24:05 crc kubenswrapper[4734]: I1205 23:24:05.869420 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 05 23:24:05 crc kubenswrapper[4734]: I1205 23:24:05.932805 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 05 23:24:06 crc kubenswrapper[4734]: I1205 23:24:06.006245 4734 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 05 23:24:06 crc kubenswrapper[4734]: I1205 23:24:06.145889 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 05 23:24:06 crc kubenswrapper[4734]: I1205 23:24:06.316676 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 05 23:24:06 crc kubenswrapper[4734]: I1205 23:24:06.323318 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 05 23:24:06 crc kubenswrapper[4734]: I1205 23:24:06.366260 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 23:24:06 crc kubenswrapper[4734]: I1205 23:24:06.367426 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 05 23:24:06 crc kubenswrapper[4734]: I1205 23:24:06.386404 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 05 23:24:06 crc kubenswrapper[4734]: I1205 23:24:06.444567 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 05 23:24:06 crc kubenswrapper[4734]: I1205 23:24:06.511177 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 05 23:24:06 crc kubenswrapper[4734]: I1205 23:24:06.632875 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 05 23:24:06 crc kubenswrapper[4734]: I1205 23:24:06.680710 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 05 23:24:06 crc 
kubenswrapper[4734]: I1205 23:24:06.692553 4734 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 05 23:24:06 crc kubenswrapper[4734]: I1205 23:24:06.719823 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 05 23:24:06 crc kubenswrapper[4734]: I1205 23:24:06.731863 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 05 23:24:06 crc kubenswrapper[4734]: I1205 23:24:06.850879 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 05 23:24:06 crc kubenswrapper[4734]: I1205 23:24:06.854353 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 05 23:24:06 crc kubenswrapper[4734]: I1205 23:24:06.868743 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 05 23:24:06 crc kubenswrapper[4734]: I1205 23:24:06.927872 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 05 23:24:06 crc kubenswrapper[4734]: I1205 23:24:06.928970 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 05 23:24:06 crc kubenswrapper[4734]: I1205 23:24:06.961073 4734 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 05 23:24:06 crc kubenswrapper[4734]: I1205 23:24:06.961160 4734 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 05 23:24:06 crc kubenswrapper[4734]: I1205 23:24:06.961225 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 23:24:06 crc kubenswrapper[4734]: I1205 23:24:06.961809 4734 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"84c8b1cd7aeab417c9fc285bc82649da2e0052260590a1a89126a789ceb1d32f"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Dec 05 23:24:06 crc kubenswrapper[4734]: I1205 23:24:06.961915 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://84c8b1cd7aeab417c9fc285bc82649da2e0052260590a1a89126a789ceb1d32f" gracePeriod=30 Dec 05 23:24:06 crc kubenswrapper[4734]: I1205 23:24:06.968558 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 05 23:24:07 crc kubenswrapper[4734]: I1205 23:24:07.008272 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 05 23:24:07 crc kubenswrapper[4734]: I1205 23:24:07.025070 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 05 23:24:07 crc kubenswrapper[4734]: I1205 23:24:07.052067 4734 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 05 23:24:07 crc kubenswrapper[4734]: I1205 23:24:07.097252 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 05 23:24:07 crc kubenswrapper[4734]: I1205 23:24:07.183721 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 05 23:24:07 crc kubenswrapper[4734]: I1205 23:24:07.217575 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 05 23:24:07 crc kubenswrapper[4734]: I1205 23:24:07.236975 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 05 23:24:07 crc kubenswrapper[4734]: I1205 23:24:07.239354 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 05 23:24:07 crc kubenswrapper[4734]: I1205 23:24:07.260130 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 05 23:24:07 crc kubenswrapper[4734]: I1205 23:24:07.306737 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 23:24:07 crc kubenswrapper[4734]: I1205 23:24:07.319109 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 05 23:24:07 crc kubenswrapper[4734]: I1205 23:24:07.332646 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 05 23:24:07 crc kubenswrapper[4734]: I1205 23:24:07.368888 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 05 23:24:07 crc kubenswrapper[4734]: 
I1205 23:24:07.428219 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 05 23:24:07 crc kubenswrapper[4734]: I1205 23:24:07.430927 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 05 23:24:07 crc kubenswrapper[4734]: I1205 23:24:07.471890 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 05 23:24:07 crc kubenswrapper[4734]: I1205 23:24:07.476702 4734 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 05 23:24:07 crc kubenswrapper[4734]: I1205 23:24:07.551810 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 05 23:24:07 crc kubenswrapper[4734]: I1205 23:24:07.582500 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 05 23:24:07 crc kubenswrapper[4734]: I1205 23:24:07.611787 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 05 23:24:07 crc kubenswrapper[4734]: I1205 23:24:07.662342 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 05 23:24:07 crc kubenswrapper[4734]: I1205 23:24:07.683741 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 05 23:24:07 crc kubenswrapper[4734]: I1205 23:24:07.686447 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 05 23:24:07 crc kubenswrapper[4734]: I1205 23:24:07.735764 4734 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"openshift-service-ca.crt" Dec 05 23:24:07 crc kubenswrapper[4734]: I1205 23:24:07.808828 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 05 23:24:07 crc kubenswrapper[4734]: I1205 23:24:07.914051 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 05 23:24:07 crc kubenswrapper[4734]: I1205 23:24:07.931107 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 05 23:24:08 crc kubenswrapper[4734]: I1205 23:24:08.046974 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 05 23:24:08 crc kubenswrapper[4734]: I1205 23:24:08.251743 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 05 23:24:08 crc kubenswrapper[4734]: I1205 23:24:08.286473 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 05 23:24:08 crc kubenswrapper[4734]: I1205 23:24:08.302182 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 05 23:24:08 crc kubenswrapper[4734]: I1205 23:24:08.362552 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 23:24:08 crc kubenswrapper[4734]: I1205 23:24:08.483862 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 05 23:24:08 crc kubenswrapper[4734]: I1205 23:24:08.486022 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 05 23:24:08 crc kubenswrapper[4734]: I1205 23:24:08.524888 4734 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 05 23:24:08 crc kubenswrapper[4734]: I1205 23:24:08.527126 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 05 23:24:08 crc kubenswrapper[4734]: I1205 23:24:08.634944 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 05 23:24:08 crc kubenswrapper[4734]: I1205 23:24:08.669510 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 05 23:24:08 crc kubenswrapper[4734]: I1205 23:24:08.729206 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 05 23:24:08 crc kubenswrapper[4734]: I1205 23:24:08.798549 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 05 23:24:08 crc kubenswrapper[4734]: I1205 23:24:08.815940 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 23:24:09 crc kubenswrapper[4734]: I1205 23:24:09.083653 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 05 23:24:09 crc kubenswrapper[4734]: I1205 23:24:09.131483 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 05 23:24:09 crc kubenswrapper[4734]: I1205 23:24:09.206979 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 05 23:24:09 crc kubenswrapper[4734]: I1205 23:24:09.314795 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 05 23:24:09 crc kubenswrapper[4734]: I1205 23:24:09.334766 4734 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 05 23:24:09 crc kubenswrapper[4734]: I1205 23:24:09.346374 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 23:24:09 crc kubenswrapper[4734]: I1205 23:24:09.358339 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 23:24:09 crc kubenswrapper[4734]: I1205 23:24:09.374988 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 05 23:24:09 crc kubenswrapper[4734]: I1205 23:24:09.444421 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 05 23:24:09 crc kubenswrapper[4734]: I1205 23:24:09.475775 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 05 23:24:09 crc kubenswrapper[4734]: I1205 23:24:09.483937 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 05 23:24:09 crc kubenswrapper[4734]: I1205 23:24:09.511028 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 05 23:24:09 crc kubenswrapper[4734]: I1205 23:24:09.686846 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 05 23:24:09 crc kubenswrapper[4734]: I1205 23:24:09.698628 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 23:24:09 crc kubenswrapper[4734]: I1205 23:24:09.847674 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 05 
23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.050970 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.077471 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.145460 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.267697 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.317921 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.388825 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.617964 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.619244 4734 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.624081 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=41.624059688 podStartE2EDuration="41.624059688s" podCreationTimestamp="2025-12-05 23:23:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:23:51.909900579 +0000 UTC m=+252.593304855" 
watchObservedRunningTime="2025-12-05 23:24:10.624059688 +0000 UTC m=+271.307463964" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.624957 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-67l49","openshift-marketplace/certified-operators-cbr82","openshift-authentication/oauth-openshift-558db77b4-gxdpj","openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.625029 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7dc5844c99-8ln7q","openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 23:24:10 crc kubenswrapper[4734]: E1205 23:24:10.625343 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82e6bed1-c7a7-4b50-af1e-68415379c41e" containerName="installer" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.625364 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="82e6bed1-c7a7-4b50-af1e-68415379c41e" containerName="installer" Dec 05 23:24:10 crc kubenswrapper[4734]: E1205 23:24:10.625381 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" containerName="extract-content" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.625390 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" containerName="extract-content" Dec 05 23:24:10 crc kubenswrapper[4734]: E1205 23:24:10.625400 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" containerName="registry-server" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.625406 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" containerName="registry-server" Dec 05 23:24:10 crc kubenswrapper[4734]: E1205 23:24:10.625422 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15bf5615-0adc-46cd-8796-d419076acac7" 
containerName="extract-content" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.625428 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="15bf5615-0adc-46cd-8796-d419076acac7" containerName="extract-content" Dec 05 23:24:10 crc kubenswrapper[4734]: E1205 23:24:10.625441 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e40087a5-cb18-4d2f-8e68-cc6e09c5bd87" containerName="oauth-openshift" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.625447 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="e40087a5-cb18-4d2f-8e68-cc6e09c5bd87" containerName="oauth-openshift" Dec 05 23:24:10 crc kubenswrapper[4734]: E1205 23:24:10.625462 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15bf5615-0adc-46cd-8796-d419076acac7" containerName="registry-server" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.625470 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="15bf5615-0adc-46cd-8796-d419076acac7" containerName="registry-server" Dec 05 23:24:10 crc kubenswrapper[4734]: E1205 23:24:10.625481 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" containerName="extract-utilities" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.625488 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" containerName="extract-utilities" Dec 05 23:24:10 crc kubenswrapper[4734]: E1205 23:24:10.625499 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15bf5615-0adc-46cd-8796-d419076acac7" containerName="extract-utilities" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.625505 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="15bf5615-0adc-46cd-8796-d419076acac7" containerName="extract-utilities" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.625645 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="e40087a5-cb18-4d2f-8e68-cc6e09c5bd87" 
containerName="oauth-openshift" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.625662 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="15bf5615-0adc-46cd-8796-d419076acac7" containerName="registry-server" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.625709 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="82e6bed1-c7a7-4b50-af1e-68415379c41e" containerName="installer" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.625720 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" containerName="registry-server" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.625719 4734 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7414d8e5-13fa-40b1-b442-3ceee2425ee1" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.625770 4734 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7414d8e5-13fa-40b1-b442-3ceee2425ee1" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.626282 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.629320 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.629742 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.630068 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.630647 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.630866 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.631952 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.632418 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.633074 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.633383 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.633596 4734 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.633787 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.637010 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.638324 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.648120 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.648834 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.649504 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.658105 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=18.658082378 podStartE2EDuration="18.658082378s" podCreationTimestamp="2025-12-05 23:23:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:24:10.653686172 +0000 UTC m=+271.337090448" watchObservedRunningTime="2025-12-05 23:24:10.658082378 +0000 UTC m=+271.341486654" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.659068 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 
23:24:10.723632 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.763494 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.809658 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-system-router-certs\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.809720 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.809754 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-audit-policies\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.809785 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.809922 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jkpt\" (UniqueName: \"kubernetes.io/projected/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-kube-api-access-5jkpt\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.809966 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.810006 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-user-template-error\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.810030 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " 
pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.810048 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.810080 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-system-service-ca\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.810107 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.810135 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-audit-dir\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.810167 4734 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-user-template-login\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.810279 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-system-session\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.811662 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.909476 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.911210 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.911567 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-audit-policies\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.911609 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.911648 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jkpt\" (UniqueName: \"kubernetes.io/projected/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-kube-api-access-5jkpt\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.911689 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.912864 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.911756 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-user-template-error\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " 
pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.912940 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.912959 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.912983 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-system-service-ca\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.913003 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.913033 4734 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-audit-dir\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.913056 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-user-template-login\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.913074 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-system-session\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.913093 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-system-router-certs\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.913111 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: 
\"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.913550 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-audit-dir\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.913894 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.914260 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-system-service-ca\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.914402 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-audit-policies\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.919000 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-user-template-error\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.919083 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.919284 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.919383 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.920143 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-user-template-login\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " 
pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.920404 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.920880 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-system-session\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.921854 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-v4-0-config-system-router-certs\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.933300 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jkpt\" (UniqueName: \"kubernetes.io/projected/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175-kube-api-access-5jkpt\") pod \"oauth-openshift-7dc5844c99-8ln7q\" (UID: \"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.958377 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:10 crc kubenswrapper[4734]: I1205 23:24:10.977729 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 05 23:24:11 crc kubenswrapper[4734]: I1205 23:24:10.996608 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 05 23:24:11 crc kubenswrapper[4734]: I1205 23:24:11.045373 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 05 23:24:11 crc kubenswrapper[4734]: I1205 23:24:11.114215 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 05 23:24:11 crc kubenswrapper[4734]: I1205 23:24:11.158972 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 05 23:24:11 crc kubenswrapper[4734]: I1205 23:24:11.218623 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 05 23:24:11 crc kubenswrapper[4734]: I1205 23:24:11.219209 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 05 23:24:11 crc kubenswrapper[4734]: I1205 23:24:11.237923 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 05 23:24:11 crc kubenswrapper[4734]: I1205 23:24:11.282284 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 05 23:24:11 crc kubenswrapper[4734]: I1205 23:24:11.421872 4734 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 05 23:24:11 crc kubenswrapper[4734]: I1205 23:24:11.492745 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 05 23:24:11 crc kubenswrapper[4734]: I1205 23:24:11.508345 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 05 23:24:11 crc kubenswrapper[4734]: I1205 23:24:11.520842 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 05 23:24:11 crc kubenswrapper[4734]: I1205 23:24:11.530882 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 05 23:24:11 crc kubenswrapper[4734]: I1205 23:24:11.621726 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15bf5615-0adc-46cd-8796-d419076acac7" path="/var/lib/kubelet/pods/15bf5615-0adc-46cd-8796-d419076acac7/volumes" Dec 05 23:24:11 crc kubenswrapper[4734]: I1205 23:24:11.622601 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e40087a5-cb18-4d2f-8e68-cc6e09c5bd87" path="/var/lib/kubelet/pods/e40087a5-cb18-4d2f-8e68-cc6e09c5bd87/volumes" Dec 05 23:24:11 crc kubenswrapper[4734]: I1205 23:24:11.623063 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 05 23:24:11 crc kubenswrapper[4734]: I1205 23:24:11.623105 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6e9b180-8bc8-4f84-b1f7-ec822b6f6560" path="/var/lib/kubelet/pods/e6e9b180-8bc8-4f84-b1f7-ec822b6f6560/volumes" Dec 05 23:24:11 crc kubenswrapper[4734]: I1205 23:24:11.672828 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 05 23:24:11 crc 
kubenswrapper[4734]: I1205 23:24:11.697407 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 05 23:24:11 crc kubenswrapper[4734]: I1205 23:24:11.770190 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 05 23:24:11 crc kubenswrapper[4734]: I1205 23:24:11.816086 4734 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 05 23:24:11 crc kubenswrapper[4734]: I1205 23:24:11.846662 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 05 23:24:11 crc kubenswrapper[4734]: I1205 23:24:11.935358 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 05 23:24:11 crc kubenswrapper[4734]: I1205 23:24:11.980959 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 05 23:24:12 crc kubenswrapper[4734]: I1205 23:24:12.002076 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 05 23:24:12 crc kubenswrapper[4734]: I1205 23:24:12.023617 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 05 23:24:12 crc kubenswrapper[4734]: I1205 23:24:12.032317 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 05 23:24:12 crc kubenswrapper[4734]: I1205 23:24:12.101753 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 05 23:24:12 crc kubenswrapper[4734]: I1205 23:24:12.128649 4734 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 05 23:24:12 crc kubenswrapper[4734]: I1205 23:24:12.162460 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 05 23:24:12 crc kubenswrapper[4734]: I1205 23:24:12.175770 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 05 23:24:12 crc kubenswrapper[4734]: I1205 23:24:12.209321 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 05 23:24:12 crc kubenswrapper[4734]: I1205 23:24:12.285083 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 05 23:24:12 crc kubenswrapper[4734]: I1205 23:24:12.309015 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 05 23:24:12 crc kubenswrapper[4734]: I1205 23:24:12.335933 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 05 23:24:12 crc kubenswrapper[4734]: I1205 23:24:12.387590 4734 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 05 23:24:12 crc kubenswrapper[4734]: I1205 23:24:12.397716 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 05 23:24:12 crc kubenswrapper[4734]: I1205 23:24:12.589895 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 05 23:24:12 crc kubenswrapper[4734]: I1205 23:24:12.613797 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 05 23:24:12 crc kubenswrapper[4734]: I1205 23:24:12.640943 4734 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 23:24:12 crc kubenswrapper[4734]: I1205 23:24:12.643751 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 23:24:12 crc kubenswrapper[4734]: I1205 23:24:12.677208 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 05 23:24:12 crc kubenswrapper[4734]: I1205 23:24:12.727056 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 05 23:24:12 crc kubenswrapper[4734]: I1205 23:24:12.740149 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 05 23:24:12 crc kubenswrapper[4734]: I1205 23:24:12.761842 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 05 23:24:12 crc kubenswrapper[4734]: I1205 23:24:12.885549 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 05 23:24:12 crc kubenswrapper[4734]: I1205 23:24:12.908511 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 05 23:24:13 crc kubenswrapper[4734]: I1205 23:24:13.189314 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 05 23:24:13 crc kubenswrapper[4734]: I1205 23:24:13.302404 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 05 23:24:13 crc kubenswrapper[4734]: I1205 23:24:13.324043 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 05 23:24:13 crc kubenswrapper[4734]: I1205 23:24:13.351537 4734 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 05 23:24:13 crc kubenswrapper[4734]: I1205 23:24:13.381234 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 05 23:24:13 crc kubenswrapper[4734]: I1205 23:24:13.391718 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 05 23:24:13 crc kubenswrapper[4734]: I1205 23:24:13.446029 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 05 23:24:13 crc kubenswrapper[4734]: I1205 23:24:13.487300 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 05 23:24:13 crc kubenswrapper[4734]: I1205 23:24:13.637187 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 05 23:24:13 crc kubenswrapper[4734]: I1205 23:24:13.637365 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 05 23:24:13 crc kubenswrapper[4734]: I1205 23:24:13.729835 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 05 23:24:13 crc kubenswrapper[4734]: I1205 23:24:13.732708 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 05 23:24:13 crc kubenswrapper[4734]: I1205 23:24:13.789913 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 05 23:24:13 crc kubenswrapper[4734]: I1205 23:24:13.819208 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 05 23:24:13 crc kubenswrapper[4734]: I1205 23:24:13.849359 4734 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 05 23:24:13 crc kubenswrapper[4734]: I1205 23:24:13.874879 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 05 23:24:14 crc kubenswrapper[4734]: I1205 23:24:14.061419 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 05 23:24:14 crc kubenswrapper[4734]: I1205 23:24:14.080750 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 05 23:24:14 crc kubenswrapper[4734]: I1205 23:24:14.106782 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 05 23:24:14 crc kubenswrapper[4734]: I1205 23:24:14.110852 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 05 23:24:14 crc kubenswrapper[4734]: E1205 23:24:14.127404 4734 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 05 23:24:14 crc kubenswrapper[4734]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-7dc5844c99-8ln7q_openshift-authentication_6b9f465b-b5e7-49ed-8f1c-4710ba6c4175_0(0b23d87ac341bcf01c2c21eae2266f7a30dab70038bad8a99f3e8fb6c305eb4c): error adding pod openshift-authentication_oauth-openshift-7dc5844c99-8ln7q to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0b23d87ac341bcf01c2c21eae2266f7a30dab70038bad8a99f3e8fb6c305eb4c" Netns:"/var/run/netns/e7f2d728-20a2-4c88-a96f-cd14b70453ec" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-7dc5844c99-8ln7q;K8S_POD_INFRA_CONTAINER_ID=0b23d87ac341bcf01c2c21eae2266f7a30dab70038bad8a99f3e8fb6c305eb4c;K8S_POD_UID=6b9f465b-b5e7-49ed-8f1c-4710ba6c4175" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-7dc5844c99-8ln7q] networking: Multus: [openshift-authentication/oauth-openshift-7dc5844c99-8ln7q/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-7dc5844c99-8ln7q in out of cluster comm: pod "oauth-openshift-7dc5844c99-8ln7q" not found Dec 05 23:24:14 crc kubenswrapper[4734]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 05 23:24:14 crc kubenswrapper[4734]: > Dec 05 23:24:14 crc kubenswrapper[4734]: E1205 23:24:14.127493 4734 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 05 23:24:14 crc kubenswrapper[4734]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-7dc5844c99-8ln7q_openshift-authentication_6b9f465b-b5e7-49ed-8f1c-4710ba6c4175_0(0b23d87ac341bcf01c2c21eae2266f7a30dab70038bad8a99f3e8fb6c305eb4c): error adding pod openshift-authentication_oauth-openshift-7dc5844c99-8ln7q to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0b23d87ac341bcf01c2c21eae2266f7a30dab70038bad8a99f3e8fb6c305eb4c" Netns:"/var/run/netns/e7f2d728-20a2-4c88-a96f-cd14b70453ec" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-7dc5844c99-8ln7q;K8S_POD_INFRA_CONTAINER_ID=0b23d87ac341bcf01c2c21eae2266f7a30dab70038bad8a99f3e8fb6c305eb4c;K8S_POD_UID=6b9f465b-b5e7-49ed-8f1c-4710ba6c4175" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-7dc5844c99-8ln7q] networking: Multus: [openshift-authentication/oauth-openshift-7dc5844c99-8ln7q/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-7dc5844c99-8ln7q in out of cluster comm: pod "oauth-openshift-7dc5844c99-8ln7q" not found Dec 05 23:24:14 crc kubenswrapper[4734]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 05 23:24:14 crc kubenswrapper[4734]: > pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:14 crc kubenswrapper[4734]: E1205 23:24:14.127518 4734 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 05 23:24:14 crc kubenswrapper[4734]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-7dc5844c99-8ln7q_openshift-authentication_6b9f465b-b5e7-49ed-8f1c-4710ba6c4175_0(0b23d87ac341bcf01c2c21eae2266f7a30dab70038bad8a99f3e8fb6c305eb4c): error adding pod openshift-authentication_oauth-openshift-7dc5844c99-8ln7q to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0b23d87ac341bcf01c2c21eae2266f7a30dab70038bad8a99f3e8fb6c305eb4c" 
Netns:"/var/run/netns/e7f2d728-20a2-4c88-a96f-cd14b70453ec" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-7dc5844c99-8ln7q;K8S_POD_INFRA_CONTAINER_ID=0b23d87ac341bcf01c2c21eae2266f7a30dab70038bad8a99f3e8fb6c305eb4c;K8S_POD_UID=6b9f465b-b5e7-49ed-8f1c-4710ba6c4175" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-7dc5844c99-8ln7q] networking: Multus: [openshift-authentication/oauth-openshift-7dc5844c99-8ln7q/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-7dc5844c99-8ln7q in out of cluster comm: pod "oauth-openshift-7dc5844c99-8ln7q" not found Dec 05 23:24:14 crc kubenswrapper[4734]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 05 23:24:14 crc kubenswrapper[4734]: > pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:14 crc kubenswrapper[4734]: E1205 23:24:14.127635 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-7dc5844c99-8ln7q_openshift-authentication(6b9f465b-b5e7-49ed-8f1c-4710ba6c4175)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-7dc5844c99-8ln7q_openshift-authentication(6b9f465b-b5e7-49ed-8f1c-4710ba6c4175)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-7dc5844c99-8ln7q_openshift-authentication_6b9f465b-b5e7-49ed-8f1c-4710ba6c4175_0(0b23d87ac341bcf01c2c21eae2266f7a30dab70038bad8a99f3e8fb6c305eb4c): error adding pod 
openshift-authentication_oauth-openshift-7dc5844c99-8ln7q to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"0b23d87ac341bcf01c2c21eae2266f7a30dab70038bad8a99f3e8fb6c305eb4c\\\" Netns:\\\"/var/run/netns/e7f2d728-20a2-4c88-a96f-cd14b70453ec\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-7dc5844c99-8ln7q;K8S_POD_INFRA_CONTAINER_ID=0b23d87ac341bcf01c2c21eae2266f7a30dab70038bad8a99f3e8fb6c305eb4c;K8S_POD_UID=6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-7dc5844c99-8ln7q] networking: Multus: [openshift-authentication/oauth-openshift-7dc5844c99-8ln7q/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-7dc5844c99-8ln7q in out of cluster comm: pod \\\"oauth-openshift-7dc5844c99-8ln7q\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" podUID="6b9f465b-b5e7-49ed-8f1c-4710ba6c4175" Dec 05 23:24:14 crc kubenswrapper[4734]: I1205 23:24:14.189956 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 05 23:24:14 crc kubenswrapper[4734]: I1205 23:24:14.193399 4734 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 05 23:24:14 crc kubenswrapper[4734]: I1205 23:24:14.412268 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 05 23:24:14 crc kubenswrapper[4734]: I1205 23:24:14.412978 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 05 23:24:14 crc kubenswrapper[4734]: I1205 23:24:14.528669 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 05 23:24:14 crc kubenswrapper[4734]: I1205 23:24:14.545341 4734 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 05 23:24:14 crc kubenswrapper[4734]: I1205 23:24:14.545665 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://37f5547a1e7a533696a3421088bfced9cb82e86649446f6c81beda8303138121" gracePeriod=5 Dec 05 23:24:14 crc kubenswrapper[4734]: I1205 23:24:14.616891 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 05 23:24:14 crc kubenswrapper[4734]: I1205 23:24:14.722997 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 05 23:24:14 crc kubenswrapper[4734]: I1205 23:24:14.838230 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 05 23:24:15 crc kubenswrapper[4734]: I1205 23:24:15.017585 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 05 23:24:15 crc kubenswrapper[4734]: I1205 23:24:15.107380 4734 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 05 23:24:15 crc kubenswrapper[4734]: I1205 23:24:15.107552 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 05 23:24:15 crc kubenswrapper[4734]: I1205 23:24:15.116705 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 05 23:24:15 crc kubenswrapper[4734]: I1205 23:24:15.120846 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 05 23:24:15 crc kubenswrapper[4734]: I1205 23:24:15.139913 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 05 23:24:15 crc kubenswrapper[4734]: I1205 23:24:15.194353 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 05 23:24:15 crc kubenswrapper[4734]: I1205 23:24:15.300029 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 05 23:24:15 crc kubenswrapper[4734]: I1205 23:24:15.371287 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 05 23:24:15 crc kubenswrapper[4734]: I1205 23:24:15.567461 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 05 23:24:15 crc kubenswrapper[4734]: I1205 23:24:15.728464 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 05 23:24:15 crc kubenswrapper[4734]: I1205 23:24:15.745477 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 05 23:24:15 crc kubenswrapper[4734]: I1205 
23:24:15.844699 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 05 23:24:15 crc kubenswrapper[4734]: I1205 23:24:15.946784 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 05 23:24:15 crc kubenswrapper[4734]: I1205 23:24:15.955197 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 05 23:24:15 crc kubenswrapper[4734]: I1205 23:24:15.981798 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 05 23:24:16 crc kubenswrapper[4734]: I1205 23:24:16.108234 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 05 23:24:16 crc kubenswrapper[4734]: I1205 23:24:16.290203 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 05 23:24:16 crc kubenswrapper[4734]: I1205 23:24:16.312306 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 05 23:24:16 crc kubenswrapper[4734]: I1205 23:24:16.318725 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 05 23:24:16 crc kubenswrapper[4734]: I1205 23:24:16.462480 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 05 23:24:16 crc kubenswrapper[4734]: I1205 23:24:16.565781 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 05 23:24:16 crc kubenswrapper[4734]: I1205 23:24:16.814010 4734 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 05 23:24:16 crc kubenswrapper[4734]: I1205 23:24:16.896947 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 05 23:24:16 crc kubenswrapper[4734]: I1205 23:24:16.908404 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 05 23:24:16 crc kubenswrapper[4734]: I1205 23:24:16.963455 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 05 23:24:17 crc kubenswrapper[4734]: I1205 23:24:17.072002 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 05 23:24:17 crc kubenswrapper[4734]: I1205 23:24:17.122793 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 05 23:24:17 crc kubenswrapper[4734]: I1205 23:24:17.217720 4734 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 05 23:24:17 crc kubenswrapper[4734]: I1205 23:24:17.291966 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 05 23:24:17 crc kubenswrapper[4734]: I1205 23:24:17.307253 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 05 23:24:17 crc kubenswrapper[4734]: I1205 23:24:17.340687 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 05 23:24:17 crc kubenswrapper[4734]: I1205 23:24:17.581385 4734 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 05 23:24:17 crc kubenswrapper[4734]: I1205 23:24:17.865873 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 05 23:24:18 crc kubenswrapper[4734]: I1205 23:24:18.443337 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 05 23:24:18 crc kubenswrapper[4734]: I1205 23:24:18.493707 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 05 23:24:18 crc kubenswrapper[4734]: I1205 23:24:18.585216 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 05 23:24:18 crc kubenswrapper[4734]: I1205 23:24:18.843705 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 05 23:24:19 crc kubenswrapper[4734]: I1205 23:24:19.025542 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 05 23:24:20 crc kubenswrapper[4734]: I1205 23:24:20.120946 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 05 23:24:20 crc kubenswrapper[4734]: I1205 23:24:20.121380 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 23:24:20 crc kubenswrapper[4734]: I1205 23:24:20.252838 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 23:24:20 crc kubenswrapper[4734]: I1205 23:24:20.252948 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 23:24:20 crc kubenswrapper[4734]: I1205 23:24:20.253012 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 23:24:20 crc kubenswrapper[4734]: I1205 23:24:20.253043 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 23:24:20 crc kubenswrapper[4734]: I1205 23:24:20.253032 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:24:20 crc kubenswrapper[4734]: I1205 23:24:20.253071 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 23:24:20 crc kubenswrapper[4734]: I1205 23:24:20.253146 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:24:20 crc kubenswrapper[4734]: I1205 23:24:20.253147 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:24:20 crc kubenswrapper[4734]: I1205 23:24:20.253301 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:24:20 crc kubenswrapper[4734]: I1205 23:24:20.254105 4734 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 05 23:24:20 crc kubenswrapper[4734]: I1205 23:24:20.254148 4734 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 05 23:24:20 crc kubenswrapper[4734]: I1205 23:24:20.254167 4734 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 05 23:24:20 crc kubenswrapper[4734]: I1205 23:24:20.254183 4734 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 05 23:24:20 crc kubenswrapper[4734]: I1205 23:24:20.264936 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:24:20 crc kubenswrapper[4734]: I1205 23:24:20.356043 4734 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 05 23:24:20 crc kubenswrapper[4734]: I1205 23:24:20.521193 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 05 23:24:20 crc kubenswrapper[4734]: I1205 23:24:20.521256 4734 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="37f5547a1e7a533696a3421088bfced9cb82e86649446f6c81beda8303138121" exitCode=137 Dec 05 23:24:20 crc kubenswrapper[4734]: I1205 23:24:20.521339 4734 scope.go:117] "RemoveContainer" containerID="37f5547a1e7a533696a3421088bfced9cb82e86649446f6c81beda8303138121" Dec 05 23:24:20 crc kubenswrapper[4734]: I1205 23:24:20.521439 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 23:24:20 crc kubenswrapper[4734]: I1205 23:24:20.545600 4734 scope.go:117] "RemoveContainer" containerID="37f5547a1e7a533696a3421088bfced9cb82e86649446f6c81beda8303138121" Dec 05 23:24:20 crc kubenswrapper[4734]: E1205 23:24:20.546018 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37f5547a1e7a533696a3421088bfced9cb82e86649446f6c81beda8303138121\": container with ID starting with 37f5547a1e7a533696a3421088bfced9cb82e86649446f6c81beda8303138121 not found: ID does not exist" containerID="37f5547a1e7a533696a3421088bfced9cb82e86649446f6c81beda8303138121" Dec 05 23:24:20 crc kubenswrapper[4734]: I1205 23:24:20.546050 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37f5547a1e7a533696a3421088bfced9cb82e86649446f6c81beda8303138121"} err="failed to get container status \"37f5547a1e7a533696a3421088bfced9cb82e86649446f6c81beda8303138121\": rpc error: code = NotFound desc = could not find container \"37f5547a1e7a533696a3421088bfced9cb82e86649446f6c81beda8303138121\": container with ID starting with 37f5547a1e7a533696a3421088bfced9cb82e86649446f6c81beda8303138121 not found: ID does not exist" Dec 05 23:24:21 crc kubenswrapper[4734]: I1205 23:24:21.622025 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 05 23:24:21 crc kubenswrapper[4734]: I1205 23:24:21.622580 4734 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 05 23:24:21 crc kubenswrapper[4734]: I1205 23:24:21.640069 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 05 23:24:21 crc kubenswrapper[4734]: I1205 
23:24:21.640141 4734 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="0b175d77-9f55-4a4d-896b-8d00a900c92c" Dec 05 23:24:21 crc kubenswrapper[4734]: I1205 23:24:21.647495 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 05 23:24:21 crc kubenswrapper[4734]: I1205 23:24:21.647568 4734 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="0b175d77-9f55-4a4d-896b-8d00a900c92c" Dec 05 23:24:27 crc kubenswrapper[4734]: I1205 23:24:27.613857 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:27 crc kubenswrapper[4734]: I1205 23:24:27.615337 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:30 crc kubenswrapper[4734]: E1205 23:24:30.487056 4734 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 05 23:24:30 crc kubenswrapper[4734]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-7dc5844c99-8ln7q_openshift-authentication_6b9f465b-b5e7-49ed-8f1c-4710ba6c4175_0(cc9f7a2327c1e090bd17c4b84d61ff6d6b6cb221b51012ac51912cf6a6d236d4): error adding pod openshift-authentication_oauth-openshift-7dc5844c99-8ln7q to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"cc9f7a2327c1e090bd17c4b84d61ff6d6b6cb221b51012ac51912cf6a6d236d4" Netns:"/var/run/netns/4a572d12-04d6-4a6a-b67c-af56de56e1ed" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-7dc5844c99-8ln7q;K8S_POD_INFRA_CONTAINER_ID=cc9f7a2327c1e090bd17c4b84d61ff6d6b6cb221b51012ac51912cf6a6d236d4;K8S_POD_UID=6b9f465b-b5e7-49ed-8f1c-4710ba6c4175" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-7dc5844c99-8ln7q] networking: Multus: [openshift-authentication/oauth-openshift-7dc5844c99-8ln7q/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-7dc5844c99-8ln7q in out of cluster comm: pod "oauth-openshift-7dc5844c99-8ln7q" not found Dec 05 23:24:30 crc kubenswrapper[4734]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 05 23:24:30 crc kubenswrapper[4734]: > Dec 05 23:24:30 crc kubenswrapper[4734]: E1205 23:24:30.487481 4734 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 05 23:24:30 crc kubenswrapper[4734]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-7dc5844c99-8ln7q_openshift-authentication_6b9f465b-b5e7-49ed-8f1c-4710ba6c4175_0(cc9f7a2327c1e090bd17c4b84d61ff6d6b6cb221b51012ac51912cf6a6d236d4): error adding pod openshift-authentication_oauth-openshift-7dc5844c99-8ln7q to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"cc9f7a2327c1e090bd17c4b84d61ff6d6b6cb221b51012ac51912cf6a6d236d4" Netns:"/var/run/netns/4a572d12-04d6-4a6a-b67c-af56de56e1ed" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-7dc5844c99-8ln7q;K8S_POD_INFRA_CONTAINER_ID=cc9f7a2327c1e090bd17c4b84d61ff6d6b6cb221b51012ac51912cf6a6d236d4;K8S_POD_UID=6b9f465b-b5e7-49ed-8f1c-4710ba6c4175" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-7dc5844c99-8ln7q] networking: Multus: [openshift-authentication/oauth-openshift-7dc5844c99-8ln7q/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-7dc5844c99-8ln7q in out of cluster comm: pod "oauth-openshift-7dc5844c99-8ln7q" not found Dec 05 23:24:30 crc kubenswrapper[4734]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 05 23:24:30 crc kubenswrapper[4734]: > pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:30 crc kubenswrapper[4734]: E1205 23:24:30.487510 4734 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 05 23:24:30 crc kubenswrapper[4734]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-7dc5844c99-8ln7q_openshift-authentication_6b9f465b-b5e7-49ed-8f1c-4710ba6c4175_0(cc9f7a2327c1e090bd17c4b84d61ff6d6b6cb221b51012ac51912cf6a6d236d4): error adding pod openshift-authentication_oauth-openshift-7dc5844c99-8ln7q to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"cc9f7a2327c1e090bd17c4b84d61ff6d6b6cb221b51012ac51912cf6a6d236d4" 
Netns:"/var/run/netns/4a572d12-04d6-4a6a-b67c-af56de56e1ed" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-7dc5844c99-8ln7q;K8S_POD_INFRA_CONTAINER_ID=cc9f7a2327c1e090bd17c4b84d61ff6d6b6cb221b51012ac51912cf6a6d236d4;K8S_POD_UID=6b9f465b-b5e7-49ed-8f1c-4710ba6c4175" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-7dc5844c99-8ln7q] networking: Multus: [openshift-authentication/oauth-openshift-7dc5844c99-8ln7q/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-7dc5844c99-8ln7q in out of cluster comm: pod "oauth-openshift-7dc5844c99-8ln7q" not found Dec 05 23:24:30 crc kubenswrapper[4734]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 05 23:24:30 crc kubenswrapper[4734]: > pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:30 crc kubenswrapper[4734]: E1205 23:24:30.487600 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-7dc5844c99-8ln7q_openshift-authentication(6b9f465b-b5e7-49ed-8f1c-4710ba6c4175)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-7dc5844c99-8ln7q_openshift-authentication(6b9f465b-b5e7-49ed-8f1c-4710ba6c4175)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-7dc5844c99-8ln7q_openshift-authentication_6b9f465b-b5e7-49ed-8f1c-4710ba6c4175_0(cc9f7a2327c1e090bd17c4b84d61ff6d6b6cb221b51012ac51912cf6a6d236d4): error adding pod 
openshift-authentication_oauth-openshift-7dc5844c99-8ln7q to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"cc9f7a2327c1e090bd17c4b84d61ff6d6b6cb221b51012ac51912cf6a6d236d4\\\" Netns:\\\"/var/run/netns/4a572d12-04d6-4a6a-b67c-af56de56e1ed\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-7dc5844c99-8ln7q;K8S_POD_INFRA_CONTAINER_ID=cc9f7a2327c1e090bd17c4b84d61ff6d6b6cb221b51012ac51912cf6a6d236d4;K8S_POD_UID=6b9f465b-b5e7-49ed-8f1c-4710ba6c4175\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-7dc5844c99-8ln7q] networking: Multus: [openshift-authentication/oauth-openshift-7dc5844c99-8ln7q/6b9f465b-b5e7-49ed-8f1c-4710ba6c4175]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-7dc5844c99-8ln7q in out of cluster comm: pod \\\"oauth-openshift-7dc5844c99-8ln7q\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" podUID="6b9f465b-b5e7-49ed-8f1c-4710ba6c4175" Dec 05 23:24:35 crc kubenswrapper[4734]: I1205 23:24:35.627934 4734 generic.go:334] "Generic (PLEG): container finished" podID="0cd8fc51-deec-410b-b2bb-4818c2f71230" containerID="62cf1aaf8f798a7c616550fba162e6d45a92efe82a855b1f8651d9c990978bfe" exitCode=0 Dec 05 23:24:35 crc 
kubenswrapper[4734]: I1205 23:24:35.628055 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ws6qt" event={"ID":"0cd8fc51-deec-410b-b2bb-4818c2f71230","Type":"ContainerDied","Data":"62cf1aaf8f798a7c616550fba162e6d45a92efe82a855b1f8651d9c990978bfe"} Dec 05 23:24:35 crc kubenswrapper[4734]: I1205 23:24:35.629227 4734 scope.go:117] "RemoveContainer" containerID="62cf1aaf8f798a7c616550fba162e6d45a92efe82a855b1f8651d9c990978bfe" Dec 05 23:24:36 crc kubenswrapper[4734]: I1205 23:24:36.636176 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ws6qt" event={"ID":"0cd8fc51-deec-410b-b2bb-4818c2f71230","Type":"ContainerStarted","Data":"06cc21fd5615317d75503b49b30fc48c2b6ea896d839b4dba4330f382f0b5f3f"} Dec 05 23:24:36 crc kubenswrapper[4734]: I1205 23:24:36.636962 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-ws6qt" Dec 05 23:24:36 crc kubenswrapper[4734]: I1205 23:24:36.638249 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-ws6qt" Dec 05 23:24:37 crc kubenswrapper[4734]: I1205 23:24:37.655151 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 05 23:24:37 crc kubenswrapper[4734]: I1205 23:24:37.657874 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 05 23:24:37 crc kubenswrapper[4734]: I1205 23:24:37.657944 4734 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="84c8b1cd7aeab417c9fc285bc82649da2e0052260590a1a89126a789ceb1d32f" exitCode=137 Dec 05 
23:24:37 crc kubenswrapper[4734]: I1205 23:24:37.658086 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"84c8b1cd7aeab417c9fc285bc82649da2e0052260590a1a89126a789ceb1d32f"} Dec 05 23:24:37 crc kubenswrapper[4734]: I1205 23:24:37.658208 4734 scope.go:117] "RemoveContainer" containerID="c5181f04d7adfc610337f6ca52413fb0d6af757ed26f97f93a5aab8afc3bb0b7" Dec 05 23:24:38 crc kubenswrapper[4734]: I1205 23:24:38.669314 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 05 23:24:38 crc kubenswrapper[4734]: I1205 23:24:38.672632 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d8afa227fe4bb06c0901746e420b19b4eb439b52f9421ef2508597777cfa05d6"} Dec 05 23:24:43 crc kubenswrapper[4734]: I1205 23:24:43.613777 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:43 crc kubenswrapper[4734]: I1205 23:24:43.615124 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:43 crc kubenswrapper[4734]: I1205 23:24:43.840945 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7dc5844c99-8ln7q"] Dec 05 23:24:43 crc kubenswrapper[4734]: W1205 23:24:43.850309 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b9f465b_b5e7_49ed_8f1c_4710ba6c4175.slice/crio-dfbcc7e45867844196efb80762340c4b107eae553d5d00df505e4a90d8c2a44c WatchSource:0}: Error finding container dfbcc7e45867844196efb80762340c4b107eae553d5d00df505e4a90d8c2a44c: Status 404 returned error can't find the container with id dfbcc7e45867844196efb80762340c4b107eae553d5d00df505e4a90d8c2a44c Dec 05 23:24:44 crc kubenswrapper[4734]: I1205 23:24:44.714468 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" event={"ID":"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175","Type":"ContainerStarted","Data":"85aceb33a7588c5cc3216c8e72da89e84511c1cfe7bc3d9d3153e1f8f77e4c20"} Dec 05 23:24:44 crc kubenswrapper[4734]: I1205 23:24:44.715142 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" event={"ID":"6b9f465b-b5e7-49ed-8f1c-4710ba6c4175","Type":"ContainerStarted","Data":"dfbcc7e45867844196efb80762340c4b107eae553d5d00df505e4a90d8c2a44c"} Dec 05 23:24:44 crc kubenswrapper[4734]: I1205 23:24:44.715190 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:44 crc kubenswrapper[4734]: I1205 23:24:44.883923 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" Dec 05 23:24:44 crc kubenswrapper[4734]: I1205 23:24:44.908813 4734 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-authentication/oauth-openshift-7dc5844c99-8ln7q" podStartSLOduration=86.908795987 podStartE2EDuration="1m26.908795987s" podCreationTimestamp="2025-12-05 23:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:24:44.742982226 +0000 UTC m=+305.426386512" watchObservedRunningTime="2025-12-05 23:24:44.908795987 +0000 UTC m=+305.592200263" Dec 05 23:24:46 crc kubenswrapper[4734]: I1205 23:24:46.960027 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 23:24:46 crc kubenswrapper[4734]: I1205 23:24:46.964876 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 23:24:47 crc kubenswrapper[4734]: I1205 23:24:47.733682 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 23:24:47 crc kubenswrapper[4734]: I1205 23:24:47.738064 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 23:24:54 crc kubenswrapper[4734]: I1205 23:24:54.004208 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fsrfk"] Dec 05 23:24:54 crc kubenswrapper[4734]: I1205 23:24:54.005379 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-fsrfk" podUID="376d1200-a143-4f81-9399-41fdafd1f0b1" containerName="controller-manager" containerID="cri-o://e2286978561d0807de2e2e499ff921509cdd52a708d18a2c37e60899c32b92e8" gracePeriod=30 Dec 05 23:24:54 crc kubenswrapper[4734]: I1205 23:24:54.021574 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-x5rwz"] Dec 05 23:24:54 crc kubenswrapper[4734]: I1205 23:24:54.021896 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x5rwz" podUID="b7ff8265-c71c-4b81-a8db-3b68a2118fd6" containerName="route-controller-manager" containerID="cri-o://086fca1fc44c3e329041c16f083699c2d523501db3f50afa0a070e5ef6b3b4c4" gracePeriod=30 Dec 05 23:24:54 crc kubenswrapper[4734]: I1205 23:24:54.825995 4734 generic.go:334] "Generic (PLEG): container finished" podID="376d1200-a143-4f81-9399-41fdafd1f0b1" containerID="e2286978561d0807de2e2e499ff921509cdd52a708d18a2c37e60899c32b92e8" exitCode=0 Dec 05 23:24:54 crc kubenswrapper[4734]: I1205 23:24:54.826142 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fsrfk" event={"ID":"376d1200-a143-4f81-9399-41fdafd1f0b1","Type":"ContainerDied","Data":"e2286978561d0807de2e2e499ff921509cdd52a708d18a2c37e60899c32b92e8"} Dec 05 23:24:54 crc kubenswrapper[4734]: I1205 23:24:54.830286 4734 generic.go:334] "Generic (PLEG): container finished" podID="b7ff8265-c71c-4b81-a8db-3b68a2118fd6" containerID="086fca1fc44c3e329041c16f083699c2d523501db3f50afa0a070e5ef6b3b4c4" exitCode=0 Dec 05 23:24:54 crc kubenswrapper[4734]: I1205 23:24:54.830348 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x5rwz" event={"ID":"b7ff8265-c71c-4b81-a8db-3b68a2118fd6","Type":"ContainerDied","Data":"086fca1fc44c3e329041c16f083699c2d523501db3f50afa0a070e5ef6b3b4c4"} Dec 05 23:24:54 crc kubenswrapper[4734]: I1205 23:24:54.971718 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fsrfk" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.004645 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8kkk\" (UniqueName: \"kubernetes.io/projected/376d1200-a143-4f81-9399-41fdafd1f0b1-kube-api-access-q8kkk\") pod \"376d1200-a143-4f81-9399-41fdafd1f0b1\" (UID: \"376d1200-a143-4f81-9399-41fdafd1f0b1\") " Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.004725 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/376d1200-a143-4f81-9399-41fdafd1f0b1-config\") pod \"376d1200-a143-4f81-9399-41fdafd1f0b1\" (UID: \"376d1200-a143-4f81-9399-41fdafd1f0b1\") " Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.004813 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/376d1200-a143-4f81-9399-41fdafd1f0b1-proxy-ca-bundles\") pod \"376d1200-a143-4f81-9399-41fdafd1f0b1\" (UID: \"376d1200-a143-4f81-9399-41fdafd1f0b1\") " Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.004857 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/376d1200-a143-4f81-9399-41fdafd1f0b1-serving-cert\") pod \"376d1200-a143-4f81-9399-41fdafd1f0b1\" (UID: \"376d1200-a143-4f81-9399-41fdafd1f0b1\") " Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.004891 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/376d1200-a143-4f81-9399-41fdafd1f0b1-client-ca\") pod \"376d1200-a143-4f81-9399-41fdafd1f0b1\" (UID: \"376d1200-a143-4f81-9399-41fdafd1f0b1\") " Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.005808 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/376d1200-a143-4f81-9399-41fdafd1f0b1-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "376d1200-a143-4f81-9399-41fdafd1f0b1" (UID: "376d1200-a143-4f81-9399-41fdafd1f0b1"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.006482 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/376d1200-a143-4f81-9399-41fdafd1f0b1-client-ca" (OuterVolumeSpecName: "client-ca") pod "376d1200-a143-4f81-9399-41fdafd1f0b1" (UID: "376d1200-a143-4f81-9399-41fdafd1f0b1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.007099 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/376d1200-a143-4f81-9399-41fdafd1f0b1-config" (OuterVolumeSpecName: "config") pod "376d1200-a143-4f81-9399-41fdafd1f0b1" (UID: "376d1200-a143-4f81-9399-41fdafd1f0b1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.023372 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/376d1200-a143-4f81-9399-41fdafd1f0b1-kube-api-access-q8kkk" (OuterVolumeSpecName: "kube-api-access-q8kkk") pod "376d1200-a143-4f81-9399-41fdafd1f0b1" (UID: "376d1200-a143-4f81-9399-41fdafd1f0b1"). InnerVolumeSpecName "kube-api-access-q8kkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.023789 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/376d1200-a143-4f81-9399-41fdafd1f0b1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "376d1200-a143-4f81-9399-41fdafd1f0b1" (UID: "376d1200-a143-4f81-9399-41fdafd1f0b1"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.046452 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x5rwz" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.106511 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7ff8265-c71c-4b81-a8db-3b68a2118fd6-config\") pod \"b7ff8265-c71c-4b81-a8db-3b68a2118fd6\" (UID: \"b7ff8265-c71c-4b81-a8db-3b68a2118fd6\") " Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.106686 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7ff8265-c71c-4b81-a8db-3b68a2118fd6-serving-cert\") pod \"b7ff8265-c71c-4b81-a8db-3b68a2118fd6\" (UID: \"b7ff8265-c71c-4b81-a8db-3b68a2118fd6\") " Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.106713 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b7ff8265-c71c-4b81-a8db-3b68a2118fd6-client-ca\") pod \"b7ff8265-c71c-4b81-a8db-3b68a2118fd6\" (UID: \"b7ff8265-c71c-4b81-a8db-3b68a2118fd6\") " Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.106753 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl6ld\" (UniqueName: \"kubernetes.io/projected/b7ff8265-c71c-4b81-a8db-3b68a2118fd6-kube-api-access-tl6ld\") pod \"b7ff8265-c71c-4b81-a8db-3b68a2118fd6\" (UID: \"b7ff8265-c71c-4b81-a8db-3b68a2118fd6\") " Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.107038 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8kkk\" (UniqueName: \"kubernetes.io/projected/376d1200-a143-4f81-9399-41fdafd1f0b1-kube-api-access-q8kkk\") on node \"crc\" DevicePath \"\"" Dec 05 23:24:55 crc 
kubenswrapper[4734]: I1205 23:24:55.107061 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/376d1200-a143-4f81-9399-41fdafd1f0b1-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.107073 4734 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/376d1200-a143-4f81-9399-41fdafd1f0b1-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.107085 4734 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/376d1200-a143-4f81-9399-41fdafd1f0b1-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.107096 4734 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/376d1200-a143-4f81-9399-41fdafd1f0b1-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.107867 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7ff8265-c71c-4b81-a8db-3b68a2118fd6-client-ca" (OuterVolumeSpecName: "client-ca") pod "b7ff8265-c71c-4b81-a8db-3b68a2118fd6" (UID: "b7ff8265-c71c-4b81-a8db-3b68a2118fd6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.107920 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7ff8265-c71c-4b81-a8db-3b68a2118fd6-config" (OuterVolumeSpecName: "config") pod "b7ff8265-c71c-4b81-a8db-3b68a2118fd6" (UID: "b7ff8265-c71c-4b81-a8db-3b68a2118fd6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.114606 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7ff8265-c71c-4b81-a8db-3b68a2118fd6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b7ff8265-c71c-4b81-a8db-3b68a2118fd6" (UID: "b7ff8265-c71c-4b81-a8db-3b68a2118fd6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.118654 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7ff8265-c71c-4b81-a8db-3b68a2118fd6-kube-api-access-tl6ld" (OuterVolumeSpecName: "kube-api-access-tl6ld") pod "b7ff8265-c71c-4b81-a8db-3b68a2118fd6" (UID: "b7ff8265-c71c-4b81-a8db-3b68a2118fd6"). InnerVolumeSpecName "kube-api-access-tl6ld". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.208960 4734 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7ff8265-c71c-4b81-a8db-3b68a2118fd6-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.209010 4734 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b7ff8265-c71c-4b81-a8db-3b68a2118fd6-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.209025 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl6ld\" (UniqueName: \"kubernetes.io/projected/b7ff8265-c71c-4b81-a8db-3b68a2118fd6-kube-api-access-tl6ld\") on node \"crc\" DevicePath \"\"" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.209041 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7ff8265-c71c-4b81-a8db-3b68a2118fd6-config\") on node \"crc\" DevicePath 
\"\"" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.682709 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7d8d559c58-pjpvf"] Dec 05 23:24:55 crc kubenswrapper[4734]: E1205 23:24:55.682999 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="376d1200-a143-4f81-9399-41fdafd1f0b1" containerName="controller-manager" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.683019 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="376d1200-a143-4f81-9399-41fdafd1f0b1" containerName="controller-manager" Dec 05 23:24:55 crc kubenswrapper[4734]: E1205 23:24:55.683031 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7ff8265-c71c-4b81-a8db-3b68a2118fd6" containerName="route-controller-manager" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.683038 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7ff8265-c71c-4b81-a8db-3b68a2118fd6" containerName="route-controller-manager" Dec 05 23:24:55 crc kubenswrapper[4734]: E1205 23:24:55.683062 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.683074 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.683184 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.683197 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="376d1200-a143-4f81-9399-41fdafd1f0b1" containerName="controller-manager" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.683212 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7ff8265-c71c-4b81-a8db-3b68a2118fd6" 
containerName="route-controller-manager" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.683688 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d8d559c58-pjpvf" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.698575 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c9d9ff4b4-gkzt6"] Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.699664 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c9d9ff4b4-gkzt6" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.706161 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c9d9ff4b4-gkzt6"] Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.715455 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2bc5562-088e-4df8-a866-e1dae67ee011-serving-cert\") pod \"route-controller-manager-5c9d9ff4b4-gkzt6\" (UID: \"e2bc5562-088e-4df8-a866-e1dae67ee011\") " pod="openshift-route-controller-manager/route-controller-manager-5c9d9ff4b4-gkzt6" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.715537 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e2bc5562-088e-4df8-a866-e1dae67ee011-client-ca\") pod \"route-controller-manager-5c9d9ff4b4-gkzt6\" (UID: \"e2bc5562-088e-4df8-a866-e1dae67ee011\") " pod="openshift-route-controller-manager/route-controller-manager-5c9d9ff4b4-gkzt6" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.715641 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e2bc5562-088e-4df8-a866-e1dae67ee011-config\") pod \"route-controller-manager-5c9d9ff4b4-gkzt6\" (UID: \"e2bc5562-088e-4df8-a866-e1dae67ee011\") " pod="openshift-route-controller-manager/route-controller-manager-5c9d9ff4b4-gkzt6" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.715735 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/826af56b-5935-4d88-9ee6-9462c30cb589-config\") pod \"controller-manager-7d8d559c58-pjpvf\" (UID: \"826af56b-5935-4d88-9ee6-9462c30cb589\") " pod="openshift-controller-manager/controller-manager-7d8d559c58-pjpvf" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.715777 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/826af56b-5935-4d88-9ee6-9462c30cb589-proxy-ca-bundles\") pod \"controller-manager-7d8d559c58-pjpvf\" (UID: \"826af56b-5935-4d88-9ee6-9462c30cb589\") " pod="openshift-controller-manager/controller-manager-7d8d559c58-pjpvf" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.715870 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/826af56b-5935-4d88-9ee6-9462c30cb589-serving-cert\") pod \"controller-manager-7d8d559c58-pjpvf\" (UID: \"826af56b-5935-4d88-9ee6-9462c30cb589\") " pod="openshift-controller-manager/controller-manager-7d8d559c58-pjpvf" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.715909 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt8j8\" (UniqueName: \"kubernetes.io/projected/e2bc5562-088e-4df8-a866-e1dae67ee011-kube-api-access-xt8j8\") pod \"route-controller-manager-5c9d9ff4b4-gkzt6\" (UID: \"e2bc5562-088e-4df8-a866-e1dae67ee011\") " 
pod="openshift-route-controller-manager/route-controller-manager-5c9d9ff4b4-gkzt6" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.715934 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62xhw\" (UniqueName: \"kubernetes.io/projected/826af56b-5935-4d88-9ee6-9462c30cb589-kube-api-access-62xhw\") pod \"controller-manager-7d8d559c58-pjpvf\" (UID: \"826af56b-5935-4d88-9ee6-9462c30cb589\") " pod="openshift-controller-manager/controller-manager-7d8d559c58-pjpvf" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.716120 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/826af56b-5935-4d88-9ee6-9462c30cb589-client-ca\") pod \"controller-manager-7d8d559c58-pjpvf\" (UID: \"826af56b-5935-4d88-9ee6-9462c30cb589\") " pod="openshift-controller-manager/controller-manager-7d8d559c58-pjpvf" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.721586 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7d8d559c58-pjpvf"] Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.817930 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/826af56b-5935-4d88-9ee6-9462c30cb589-client-ca\") pod \"controller-manager-7d8d559c58-pjpvf\" (UID: \"826af56b-5935-4d88-9ee6-9462c30cb589\") " pod="openshift-controller-manager/controller-manager-7d8d559c58-pjpvf" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.818008 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2bc5562-088e-4df8-a866-e1dae67ee011-serving-cert\") pod \"route-controller-manager-5c9d9ff4b4-gkzt6\" (UID: \"e2bc5562-088e-4df8-a866-e1dae67ee011\") " 
pod="openshift-route-controller-manager/route-controller-manager-5c9d9ff4b4-gkzt6" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.818033 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e2bc5562-088e-4df8-a866-e1dae67ee011-client-ca\") pod \"route-controller-manager-5c9d9ff4b4-gkzt6\" (UID: \"e2bc5562-088e-4df8-a866-e1dae67ee011\") " pod="openshift-route-controller-manager/route-controller-manager-5c9d9ff4b4-gkzt6" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.818058 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2bc5562-088e-4df8-a866-e1dae67ee011-config\") pod \"route-controller-manager-5c9d9ff4b4-gkzt6\" (UID: \"e2bc5562-088e-4df8-a866-e1dae67ee011\") " pod="openshift-route-controller-manager/route-controller-manager-5c9d9ff4b4-gkzt6" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.818079 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/826af56b-5935-4d88-9ee6-9462c30cb589-config\") pod \"controller-manager-7d8d559c58-pjpvf\" (UID: \"826af56b-5935-4d88-9ee6-9462c30cb589\") " pod="openshift-controller-manager/controller-manager-7d8d559c58-pjpvf" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.818098 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/826af56b-5935-4d88-9ee6-9462c30cb589-proxy-ca-bundles\") pod \"controller-manager-7d8d559c58-pjpvf\" (UID: \"826af56b-5935-4d88-9ee6-9462c30cb589\") " pod="openshift-controller-manager/controller-manager-7d8d559c58-pjpvf" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.818123 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/826af56b-5935-4d88-9ee6-9462c30cb589-serving-cert\") pod \"controller-manager-7d8d559c58-pjpvf\" (UID: \"826af56b-5935-4d88-9ee6-9462c30cb589\") " pod="openshift-controller-manager/controller-manager-7d8d559c58-pjpvf" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.818142 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62xhw\" (UniqueName: \"kubernetes.io/projected/826af56b-5935-4d88-9ee6-9462c30cb589-kube-api-access-62xhw\") pod \"controller-manager-7d8d559c58-pjpvf\" (UID: \"826af56b-5935-4d88-9ee6-9462c30cb589\") " pod="openshift-controller-manager/controller-manager-7d8d559c58-pjpvf" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.818160 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt8j8\" (UniqueName: \"kubernetes.io/projected/e2bc5562-088e-4df8-a866-e1dae67ee011-kube-api-access-xt8j8\") pod \"route-controller-manager-5c9d9ff4b4-gkzt6\" (UID: \"e2bc5562-088e-4df8-a866-e1dae67ee011\") " pod="openshift-route-controller-manager/route-controller-manager-5c9d9ff4b4-gkzt6" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.819793 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/826af56b-5935-4d88-9ee6-9462c30cb589-client-ca\") pod \"controller-manager-7d8d559c58-pjpvf\" (UID: \"826af56b-5935-4d88-9ee6-9462c30cb589\") " pod="openshift-controller-manager/controller-manager-7d8d559c58-pjpvf" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.819831 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e2bc5562-088e-4df8-a866-e1dae67ee011-client-ca\") pod \"route-controller-manager-5c9d9ff4b4-gkzt6\" (UID: \"e2bc5562-088e-4df8-a866-e1dae67ee011\") " pod="openshift-route-controller-manager/route-controller-manager-5c9d9ff4b4-gkzt6" Dec 05 23:24:55 crc kubenswrapper[4734]: 
I1205 23:24:55.820350 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/826af56b-5935-4d88-9ee6-9462c30cb589-config\") pod \"controller-manager-7d8d559c58-pjpvf\" (UID: \"826af56b-5935-4d88-9ee6-9462c30cb589\") " pod="openshift-controller-manager/controller-manager-7d8d559c58-pjpvf" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.820396 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2bc5562-088e-4df8-a866-e1dae67ee011-config\") pod \"route-controller-manager-5c9d9ff4b4-gkzt6\" (UID: \"e2bc5562-088e-4df8-a866-e1dae67ee011\") " pod="openshift-route-controller-manager/route-controller-manager-5c9d9ff4b4-gkzt6" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.820592 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/826af56b-5935-4d88-9ee6-9462c30cb589-proxy-ca-bundles\") pod \"controller-manager-7d8d559c58-pjpvf\" (UID: \"826af56b-5935-4d88-9ee6-9462c30cb589\") " pod="openshift-controller-manager/controller-manager-7d8d559c58-pjpvf" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.827352 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/826af56b-5935-4d88-9ee6-9462c30cb589-serving-cert\") pod \"controller-manager-7d8d559c58-pjpvf\" (UID: \"826af56b-5935-4d88-9ee6-9462c30cb589\") " pod="openshift-controller-manager/controller-manager-7d8d559c58-pjpvf" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.835336 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2bc5562-088e-4df8-a866-e1dae67ee011-serving-cert\") pod \"route-controller-manager-5c9d9ff4b4-gkzt6\" (UID: \"e2bc5562-088e-4df8-a866-e1dae67ee011\") " 
pod="openshift-route-controller-manager/route-controller-manager-5c9d9ff4b4-gkzt6" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.838204 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62xhw\" (UniqueName: \"kubernetes.io/projected/826af56b-5935-4d88-9ee6-9462c30cb589-kube-api-access-62xhw\") pod \"controller-manager-7d8d559c58-pjpvf\" (UID: \"826af56b-5935-4d88-9ee6-9462c30cb589\") " pod="openshift-controller-manager/controller-manager-7d8d559c58-pjpvf" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.838950 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fsrfk" event={"ID":"376d1200-a143-4f81-9399-41fdafd1f0b1","Type":"ContainerDied","Data":"60d67dc6b02aee9af46d24e4520f72f53517f8d5a6954d4f99fb039861c974b0"} Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.839122 4734 scope.go:117] "RemoveContainer" containerID="e2286978561d0807de2e2e499ff921509cdd52a708d18a2c37e60899c32b92e8" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.839360 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fsrfk" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.842164 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x5rwz" event={"ID":"b7ff8265-c71c-4b81-a8db-3b68a2118fd6","Type":"ContainerDied","Data":"86164c7afe8662c7faf0f731867ecf5fdb67d8fdc307b1bd32a896b65f6bc2ce"} Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.842248 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x5rwz" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.845223 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt8j8\" (UniqueName: \"kubernetes.io/projected/e2bc5562-088e-4df8-a866-e1dae67ee011-kube-api-access-xt8j8\") pod \"route-controller-manager-5c9d9ff4b4-gkzt6\" (UID: \"e2bc5562-088e-4df8-a866-e1dae67ee011\") " pod="openshift-route-controller-manager/route-controller-manager-5c9d9ff4b4-gkzt6" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.899259 4734 scope.go:117] "RemoveContainer" containerID="086fca1fc44c3e329041c16f083699c2d523501db3f50afa0a070e5ef6b3b4c4" Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.913597 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fsrfk"] Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.919798 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fsrfk"] Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.937383 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-x5rwz"] Dec 05 23:24:55 crc kubenswrapper[4734]: I1205 23:24:55.957408 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-x5rwz"] Dec 05 23:24:56 crc kubenswrapper[4734]: I1205 23:24:56.012122 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d8d559c58-pjpvf" Dec 05 23:24:56 crc kubenswrapper[4734]: I1205 23:24:56.029402 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c9d9ff4b4-gkzt6" Dec 05 23:24:56 crc kubenswrapper[4734]: I1205 23:24:56.377829 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c9d9ff4b4-gkzt6"] Dec 05 23:24:56 crc kubenswrapper[4734]: I1205 23:24:56.524815 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7d8d559c58-pjpvf"] Dec 05 23:24:56 crc kubenswrapper[4734]: I1205 23:24:56.854275 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d8d559c58-pjpvf" event={"ID":"826af56b-5935-4d88-9ee6-9462c30cb589","Type":"ContainerStarted","Data":"58c4dacccc89a564f0c64b58dab3c9e51bd17e297fc45e0aa3ba39453bdcc307"} Dec 05 23:24:56 crc kubenswrapper[4734]: I1205 23:24:56.854746 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d8d559c58-pjpvf" event={"ID":"826af56b-5935-4d88-9ee6-9462c30cb589","Type":"ContainerStarted","Data":"9efc5683e777ea508fa2e784120f5e5e231e9a3fa8ad2c025ce9949e811bbfc5"} Dec 05 23:24:56 crc kubenswrapper[4734]: I1205 23:24:56.854767 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7d8d559c58-pjpvf" Dec 05 23:24:56 crc kubenswrapper[4734]: I1205 23:24:56.859091 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c9d9ff4b4-gkzt6" event={"ID":"e2bc5562-088e-4df8-a866-e1dae67ee011","Type":"ContainerStarted","Data":"6d9089283595447afb3e2770c69e97a3f4ed92dd6e4b15de05dba1c69bf0f8b7"} Dec 05 23:24:56 crc kubenswrapper[4734]: I1205 23:24:56.859130 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c9d9ff4b4-gkzt6" 
event={"ID":"e2bc5562-088e-4df8-a866-e1dae67ee011","Type":"ContainerStarted","Data":"aa5d222bd84db7ed2940eab75c1addf7b2b0d71885847f891bb3bd9f3750b5d2"} Dec 05 23:24:56 crc kubenswrapper[4734]: I1205 23:24:56.859380 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c9d9ff4b4-gkzt6" Dec 05 23:24:56 crc kubenswrapper[4734]: I1205 23:24:56.861929 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7d8d559c58-pjpvf" Dec 05 23:24:56 crc kubenswrapper[4734]: I1205 23:24:56.877761 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7d8d559c58-pjpvf" podStartSLOduration=2.87774064 podStartE2EDuration="2.87774064s" podCreationTimestamp="2025-12-05 23:24:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:24:56.874471659 +0000 UTC m=+317.557875935" watchObservedRunningTime="2025-12-05 23:24:56.87774064 +0000 UTC m=+317.561144916" Dec 05 23:24:56 crc kubenswrapper[4734]: I1205 23:24:56.926243 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c9d9ff4b4-gkzt6" podStartSLOduration=2.926220724 podStartE2EDuration="2.926220724s" podCreationTimestamp="2025-12-05 23:24:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:24:56.921545589 +0000 UTC m=+317.604949865" watchObservedRunningTime="2025-12-05 23:24:56.926220724 +0000 UTC m=+317.609625000" Dec 05 23:24:57 crc kubenswrapper[4734]: I1205 23:24:57.413974 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5c9d9ff4b4-gkzt6" Dec 05 
23:24:57 crc kubenswrapper[4734]: I1205 23:24:57.620599 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="376d1200-a143-4f81-9399-41fdafd1f0b1" path="/var/lib/kubelet/pods/376d1200-a143-4f81-9399-41fdafd1f0b1/volumes" Dec 05 23:24:57 crc kubenswrapper[4734]: I1205 23:24:57.621337 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7ff8265-c71c-4b81-a8db-3b68a2118fd6" path="/var/lib/kubelet/pods/b7ff8265-c71c-4b81-a8db-3b68a2118fd6/volumes" Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.597268 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hmv47"] Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.598471 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hmv47" podUID="7e24c08e-fb74-4ae6-9c48-ae9653c964e8" containerName="registry-server" containerID="cri-o://c893a3f02e495017bbd5ae00c480a8729469b43d7ab234724e209dc00e833de8" gracePeriod=30 Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.605655 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cb4rj"] Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.606017 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cb4rj" podUID="597348be-fe32-4495-bb10-d152ed593e3e" containerName="registry-server" containerID="cri-o://8f3a5e56a02fbdb8a0a5a8736fd80d28d5c42b5deb684663ff6910b3b0b752bd" gracePeriod=30 Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.667087 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ws6qt"] Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.667154 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kdt5b"] Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 
23:24:59.667169 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gnbkp"] Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.667535 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-ws6qt" podUID="0cd8fc51-deec-410b-b2bb-4818c2f71230" containerName="marketplace-operator" containerID="cri-o://06cc21fd5615317d75503b49b30fc48c2b6ea896d839b4dba4330f382f0b5f3f" gracePeriod=30 Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.668257 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gnbkp" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" containerName="registry-server" containerID="cri-o://0ad97f0b80b9db497c44d78afb09b9217c04438574d4c9f552853398eddc04ec" gracePeriod=30 Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.673481 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kdt5b" podUID="5040a4a1-0b01-4581-89a7-37186c3caebe" containerName="registry-server" containerID="cri-o://8c612793464e454a21b04c406c2a3457cc0a807a995eba7d6716550f916c9b8b" gracePeriod=30 Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.680794 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zddtm"] Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.681779 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zddtm" Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.686131 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vj45t"] Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.686431 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vj45t" podUID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" containerName="registry-server" containerID="cri-o://5b110ebd20ad70373883a8604089e967c00f134bcab314c5426fd46a1753c413" gracePeriod=30 Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.691613 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zddtm"] Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.785816 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf8kd\" (UniqueName: \"kubernetes.io/projected/8cccd0a8-35c0-4e22-b73c-bc9282c804b6-kube-api-access-mf8kd\") pod \"marketplace-operator-79b997595-zddtm\" (UID: \"8cccd0a8-35c0-4e22-b73c-bc9282c804b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-zddtm" Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.785868 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8cccd0a8-35c0-4e22-b73c-bc9282c804b6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zddtm\" (UID: \"8cccd0a8-35c0-4e22-b73c-bc9282c804b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-zddtm" Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.785907 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8cccd0a8-35c0-4e22-b73c-bc9282c804b6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zddtm\" (UID: \"8cccd0a8-35c0-4e22-b73c-bc9282c804b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-zddtm" Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.886747 4734 generic.go:334] "Generic (PLEG): container finished" podID="597348be-fe32-4495-bb10-d152ed593e3e" containerID="8f3a5e56a02fbdb8a0a5a8736fd80d28d5c42b5deb684663ff6910b3b0b752bd" exitCode=0 Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.886787 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8cccd0a8-35c0-4e22-b73c-bc9282c804b6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zddtm\" (UID: \"8cccd0a8-35c0-4e22-b73c-bc9282c804b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-zddtm" Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.886849 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cb4rj" event={"ID":"597348be-fe32-4495-bb10-d152ed593e3e","Type":"ContainerDied","Data":"8f3a5e56a02fbdb8a0a5a8736fd80d28d5c42b5deb684663ff6910b3b0b752bd"} Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.886885 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf8kd\" (UniqueName: \"kubernetes.io/projected/8cccd0a8-35c0-4e22-b73c-bc9282c804b6-kube-api-access-mf8kd\") pod \"marketplace-operator-79b997595-zddtm\" (UID: \"8cccd0a8-35c0-4e22-b73c-bc9282c804b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-zddtm" Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.886909 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8cccd0a8-35c0-4e22-b73c-bc9282c804b6-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-zddtm\" (UID: \"8cccd0a8-35c0-4e22-b73c-bc9282c804b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-zddtm" Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.888230 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8cccd0a8-35c0-4e22-b73c-bc9282c804b6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zddtm\" (UID: \"8cccd0a8-35c0-4e22-b73c-bc9282c804b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-zddtm" Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.896623 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8cccd0a8-35c0-4e22-b73c-bc9282c804b6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zddtm\" (UID: \"8cccd0a8-35c0-4e22-b73c-bc9282c804b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-zddtm" Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.911594 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf8kd\" (UniqueName: \"kubernetes.io/projected/8cccd0a8-35c0-4e22-b73c-bc9282c804b6-kube-api-access-mf8kd\") pod \"marketplace-operator-79b997595-zddtm\" (UID: \"8cccd0a8-35c0-4e22-b73c-bc9282c804b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-zddtm" Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.912853 4734 generic.go:334] "Generic (PLEG): container finished" podID="7e24c08e-fb74-4ae6-9c48-ae9653c964e8" containerID="c893a3f02e495017bbd5ae00c480a8729469b43d7ab234724e209dc00e833de8" exitCode=0 Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.912995 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hmv47" 
event={"ID":"7e24c08e-fb74-4ae6-9c48-ae9653c964e8","Type":"ContainerDied","Data":"c893a3f02e495017bbd5ae00c480a8729469b43d7ab234724e209dc00e833de8"} Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.923562 4734 generic.go:334] "Generic (PLEG): container finished" podID="0cd8fc51-deec-410b-b2bb-4818c2f71230" containerID="06cc21fd5615317d75503b49b30fc48c2b6ea896d839b4dba4330f382f0b5f3f" exitCode=0 Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.923782 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ws6qt" event={"ID":"0cd8fc51-deec-410b-b2bb-4818c2f71230","Type":"ContainerDied","Data":"06cc21fd5615317d75503b49b30fc48c2b6ea896d839b4dba4330f382f0b5f3f"} Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.924748 4734 scope.go:117] "RemoveContainer" containerID="62cf1aaf8f798a7c616550fba162e6d45a92efe82a855b1f8651d9c990978bfe" Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.932871 4734 generic.go:334] "Generic (PLEG): container finished" podID="5040a4a1-0b01-4581-89a7-37186c3caebe" containerID="8c612793464e454a21b04c406c2a3457cc0a807a995eba7d6716550f916c9b8b" exitCode=0 Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.932957 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kdt5b" event={"ID":"5040a4a1-0b01-4581-89a7-37186c3caebe","Type":"ContainerDied","Data":"8c612793464e454a21b04c406c2a3457cc0a807a995eba7d6716550f916c9b8b"} Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.934983 4734 generic.go:334] "Generic (PLEG): container finished" podID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" containerID="0ad97f0b80b9db497c44d78afb09b9217c04438574d4c9f552853398eddc04ec" exitCode=0 Dec 05 23:24:59 crc kubenswrapper[4734]: I1205 23:24:59.935011 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gnbkp" 
event={"ID":"4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4","Type":"ContainerDied","Data":"0ad97f0b80b9db497c44d78afb09b9217c04438574d4c9f552853398eddc04ec"} Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.117897 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zddtm" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.125039 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hmv47" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.191931 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e24c08e-fb74-4ae6-9c48-ae9653c964e8-catalog-content\") pod \"7e24c08e-fb74-4ae6-9c48-ae9653c964e8\" (UID: \"7e24c08e-fb74-4ae6-9c48-ae9653c964e8\") " Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.192040 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grpqv\" (UniqueName: \"kubernetes.io/projected/7e24c08e-fb74-4ae6-9c48-ae9653c964e8-kube-api-access-grpqv\") pod \"7e24c08e-fb74-4ae6-9c48-ae9653c964e8\" (UID: \"7e24c08e-fb74-4ae6-9c48-ae9653c964e8\") " Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.192211 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e24c08e-fb74-4ae6-9c48-ae9653c964e8-utilities\") pod \"7e24c08e-fb74-4ae6-9c48-ae9653c964e8\" (UID: \"7e24c08e-fb74-4ae6-9c48-ae9653c964e8\") " Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.193601 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e24c08e-fb74-4ae6-9c48-ae9653c964e8-utilities" (OuterVolumeSpecName: "utilities") pod "7e24c08e-fb74-4ae6-9c48-ae9653c964e8" (UID: "7e24c08e-fb74-4ae6-9c48-ae9653c964e8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.193745 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e24c08e-fb74-4ae6-9c48-ae9653c964e8-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.198370 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e24c08e-fb74-4ae6-9c48-ae9653c964e8-kube-api-access-grpqv" (OuterVolumeSpecName: "kube-api-access-grpqv") pod "7e24c08e-fb74-4ae6-9c48-ae9653c964e8" (UID: "7e24c08e-fb74-4ae6-9c48-ae9653c964e8"). InnerVolumeSpecName "kube-api-access-grpqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.211336 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cb4rj" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.266099 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e24c08e-fb74-4ae6-9c48-ae9653c964e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e24c08e-fb74-4ae6-9c48-ae9653c964e8" (UID: "7e24c08e-fb74-4ae6-9c48-ae9653c964e8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.295922 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/597348be-fe32-4495-bb10-d152ed593e3e-utilities\") pod \"597348be-fe32-4495-bb10-d152ed593e3e\" (UID: \"597348be-fe32-4495-bb10-d152ed593e3e\") " Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.296644 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bsvp\" (UniqueName: \"kubernetes.io/projected/597348be-fe32-4495-bb10-d152ed593e3e-kube-api-access-8bsvp\") pod \"597348be-fe32-4495-bb10-d152ed593e3e\" (UID: \"597348be-fe32-4495-bb10-d152ed593e3e\") " Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.296744 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/597348be-fe32-4495-bb10-d152ed593e3e-catalog-content\") pod \"597348be-fe32-4495-bb10-d152ed593e3e\" (UID: \"597348be-fe32-4495-bb10-d152ed593e3e\") " Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.297237 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e24c08e-fb74-4ae6-9c48-ae9653c964e8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.297263 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grpqv\" (UniqueName: \"kubernetes.io/projected/7e24c08e-fb74-4ae6-9c48-ae9653c964e8-kube-api-access-grpqv\") on node \"crc\" DevicePath \"\"" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.299841 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/597348be-fe32-4495-bb10-d152ed593e3e-utilities" (OuterVolumeSpecName: "utilities") pod "597348be-fe32-4495-bb10-d152ed593e3e" (UID: 
"597348be-fe32-4495-bb10-d152ed593e3e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.308048 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/597348be-fe32-4495-bb10-d152ed593e3e-kube-api-access-8bsvp" (OuterVolumeSpecName: "kube-api-access-8bsvp") pod "597348be-fe32-4495-bb10-d152ed593e3e" (UID: "597348be-fe32-4495-bb10-d152ed593e3e"). InnerVolumeSpecName "kube-api-access-8bsvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.356348 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/597348be-fe32-4495-bb10-d152ed593e3e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "597348be-fe32-4495-bb10-d152ed593e3e" (UID: "597348be-fe32-4495-bb10-d152ed593e3e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.398624 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/597348be-fe32-4495-bb10-d152ed593e3e-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.398679 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bsvp\" (UniqueName: \"kubernetes.io/projected/597348be-fe32-4495-bb10-d152ed593e3e-kube-api-access-8bsvp\") on node \"crc\" DevicePath \"\"" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.398702 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/597348be-fe32-4495-bb10-d152ed593e3e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.416456 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ws6qt" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.424336 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kdt5b" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.454043 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gnbkp" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.476634 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vj45t" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.499724 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf8jt\" (UniqueName: \"kubernetes.io/projected/4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4-kube-api-access-bf8jt\") pod \"4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4\" (UID: \"4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4\") " Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.499785 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5bcz\" (UniqueName: \"kubernetes.io/projected/5040a4a1-0b01-4581-89a7-37186c3caebe-kube-api-access-t5bcz\") pod \"5040a4a1-0b01-4581-89a7-37186c3caebe\" (UID: \"5040a4a1-0b01-4581-89a7-37186c3caebe\") " Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.499821 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m98bg\" (UniqueName: \"kubernetes.io/projected/0cd8fc51-deec-410b-b2bb-4818c2f71230-kube-api-access-m98bg\") pod \"0cd8fc51-deec-410b-b2bb-4818c2f71230\" (UID: \"0cd8fc51-deec-410b-b2bb-4818c2f71230\") " Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.499850 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4-catalog-content\") pod \"4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4\" (UID: \"4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4\") " Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.499898 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0cd8fc51-deec-410b-b2bb-4818c2f71230-marketplace-operator-metrics\") pod \"0cd8fc51-deec-410b-b2bb-4818c2f71230\" (UID: \"0cd8fc51-deec-410b-b2bb-4818c2f71230\") " Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.499953 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5040a4a1-0b01-4581-89a7-37186c3caebe-catalog-content\") pod \"5040a4a1-0b01-4581-89a7-37186c3caebe\" (UID: \"5040a4a1-0b01-4581-89a7-37186c3caebe\") " Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.499976 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgxsd\" (UniqueName: \"kubernetes.io/projected/7ba0c803-1b80-4161-afa1-c9b6dc65ea00-kube-api-access-tgxsd\") pod \"7ba0c803-1b80-4161-afa1-c9b6dc65ea00\" (UID: \"7ba0c803-1b80-4161-afa1-c9b6dc65ea00\") " Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.500020 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4-utilities\") pod \"4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4\" (UID: \"4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4\") " Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.500041 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0cd8fc51-deec-410b-b2bb-4818c2f71230-marketplace-trusted-ca\") pod \"0cd8fc51-deec-410b-b2bb-4818c2f71230\" (UID: \"0cd8fc51-deec-410b-b2bb-4818c2f71230\") " Dec 
05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.500062 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ba0c803-1b80-4161-afa1-c9b6dc65ea00-utilities\") pod \"7ba0c803-1b80-4161-afa1-c9b6dc65ea00\" (UID: \"7ba0c803-1b80-4161-afa1-c9b6dc65ea00\") " Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.500106 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5040a4a1-0b01-4581-89a7-37186c3caebe-utilities\") pod \"5040a4a1-0b01-4581-89a7-37186c3caebe\" (UID: \"5040a4a1-0b01-4581-89a7-37186c3caebe\") " Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.500133 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ba0c803-1b80-4161-afa1-c9b6dc65ea00-catalog-content\") pod \"7ba0c803-1b80-4161-afa1-c9b6dc65ea00\" (UID: \"7ba0c803-1b80-4161-afa1-c9b6dc65ea00\") " Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.502429 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4-utilities" (OuterVolumeSpecName: "utilities") pod "4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" (UID: "4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.503007 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ba0c803-1b80-4161-afa1-c9b6dc65ea00-utilities" (OuterVolumeSpecName: "utilities") pod "7ba0c803-1b80-4161-afa1-c9b6dc65ea00" (UID: "7ba0c803-1b80-4161-afa1-c9b6dc65ea00"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.503492 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cd8fc51-deec-410b-b2bb-4818c2f71230-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "0cd8fc51-deec-410b-b2bb-4818c2f71230" (UID: "0cd8fc51-deec-410b-b2bb-4818c2f71230"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.504027 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5040a4a1-0b01-4581-89a7-37186c3caebe-utilities" (OuterVolumeSpecName: "utilities") pod "5040a4a1-0b01-4581-89a7-37186c3caebe" (UID: "5040a4a1-0b01-4581-89a7-37186c3caebe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.508030 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cd8fc51-deec-410b-b2bb-4818c2f71230-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "0cd8fc51-deec-410b-b2bb-4818c2f71230" (UID: "0cd8fc51-deec-410b-b2bb-4818c2f71230"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.508641 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5040a4a1-0b01-4581-89a7-37186c3caebe-kube-api-access-t5bcz" (OuterVolumeSpecName: "kube-api-access-t5bcz") pod "5040a4a1-0b01-4581-89a7-37186c3caebe" (UID: "5040a4a1-0b01-4581-89a7-37186c3caebe"). InnerVolumeSpecName "kube-api-access-t5bcz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.509812 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ba0c803-1b80-4161-afa1-c9b6dc65ea00-kube-api-access-tgxsd" (OuterVolumeSpecName: "kube-api-access-tgxsd") pod "7ba0c803-1b80-4161-afa1-c9b6dc65ea00" (UID: "7ba0c803-1b80-4161-afa1-c9b6dc65ea00"). InnerVolumeSpecName "kube-api-access-tgxsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.510430 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4-kube-api-access-bf8jt" (OuterVolumeSpecName: "kube-api-access-bf8jt") pod "4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" (UID: "4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4"). InnerVolumeSpecName "kube-api-access-bf8jt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.511905 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cd8fc51-deec-410b-b2bb-4818c2f71230-kube-api-access-m98bg" (OuterVolumeSpecName: "kube-api-access-m98bg") pod "0cd8fc51-deec-410b-b2bb-4818c2f71230" (UID: "0cd8fc51-deec-410b-b2bb-4818c2f71230"). InnerVolumeSpecName "kube-api-access-m98bg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.525245 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5040a4a1-0b01-4581-89a7-37186c3caebe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5040a4a1-0b01-4581-89a7-37186c3caebe" (UID: "5040a4a1-0b01-4581-89a7-37186c3caebe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.601603 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf8jt\" (UniqueName: \"kubernetes.io/projected/4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4-kube-api-access-bf8jt\") on node \"crc\" DevicePath \"\"" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.601653 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5bcz\" (UniqueName: \"kubernetes.io/projected/5040a4a1-0b01-4581-89a7-37186c3caebe-kube-api-access-t5bcz\") on node \"crc\" DevicePath \"\"" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.601668 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m98bg\" (UniqueName: \"kubernetes.io/projected/0cd8fc51-deec-410b-b2bb-4818c2f71230-kube-api-access-m98bg\") on node \"crc\" DevicePath \"\"" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.601719 4734 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0cd8fc51-deec-410b-b2bb-4818c2f71230-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.601739 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5040a4a1-0b01-4581-89a7-37186c3caebe-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.601758 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgxsd\" (UniqueName: \"kubernetes.io/projected/7ba0c803-1b80-4161-afa1-c9b6dc65ea00-kube-api-access-tgxsd\") on node \"crc\" DevicePath \"\"" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.601773 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4-utilities\") on node \"crc\" DevicePath 
\"\"" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.601788 4734 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0cd8fc51-deec-410b-b2bb-4818c2f71230-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.601801 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ba0c803-1b80-4161-afa1-c9b6dc65ea00-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.601813 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5040a4a1-0b01-4581-89a7-37186c3caebe-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.624879 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" (UID: "4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.638508 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ba0c803-1b80-4161-afa1-c9b6dc65ea00-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ba0c803-1b80-4161-afa1-c9b6dc65ea00" (UID: "7ba0c803-1b80-4161-afa1-c9b6dc65ea00"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.702998 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.703037 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ba0c803-1b80-4161-afa1-c9b6dc65ea00-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.717213 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zddtm"] Dec 05 23:25:00 crc kubenswrapper[4734]: W1205 23:25:00.738332 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cccd0a8_35c0_4e22_b73c_bc9282c804b6.slice/crio-1478dc3b17e4099ddda3f1674240c19aa3249324ad2b494214e1b2ebf2f488c1 WatchSource:0}: Error finding container 1478dc3b17e4099ddda3f1674240c19aa3249324ad2b494214e1b2ebf2f488c1: Status 404 returned error can't find the container with id 1478dc3b17e4099ddda3f1674240c19aa3249324ad2b494214e1b2ebf2f488c1 Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.944128 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kdt5b" event={"ID":"5040a4a1-0b01-4581-89a7-37186c3caebe","Type":"ContainerDied","Data":"7c42ae00e0855109341d487c785732b1fc1a459cf8f62942fbbc49bb3cfc2fc2"} Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.944194 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kdt5b" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.944203 4734 scope.go:117] "RemoveContainer" containerID="8c612793464e454a21b04c406c2a3457cc0a807a995eba7d6716550f916c9b8b" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.951036 4734 generic.go:334] "Generic (PLEG): container finished" podID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" containerID="5b110ebd20ad70373883a8604089e967c00f134bcab314c5426fd46a1753c413" exitCode=0 Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.951447 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vj45t" event={"ID":"7ba0c803-1b80-4161-afa1-c9b6dc65ea00","Type":"ContainerDied","Data":"5b110ebd20ad70373883a8604089e967c00f134bcab314c5426fd46a1753c413"} Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.951502 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vj45t" event={"ID":"7ba0c803-1b80-4161-afa1-c9b6dc65ea00","Type":"ContainerDied","Data":"2eceea83ef4bedf76d62f59e6c411372ef92f8088b2d055aaa79007d15228e12"} Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.951646 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vj45t" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.956620 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gnbkp" event={"ID":"4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4","Type":"ContainerDied","Data":"f5e26c5781dac26b5b7baf898b86fe72efd6beed46b91c0f5a739d90e755820d"} Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.956830 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gnbkp" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.965059 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cb4rj" event={"ID":"597348be-fe32-4495-bb10-d152ed593e3e","Type":"ContainerDied","Data":"8485d6fd69838d4f3797b3203ebb9dd871098c143a2e8ff2b9af878a1c3e1633"} Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.965184 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cb4rj" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.966269 4734 scope.go:117] "RemoveContainer" containerID="d2bfac5855f26fc9b1373233af371cc973951d17c8e5c26ecf0de2fbcc8c8ac1" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.969971 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hmv47" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.969983 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hmv47" event={"ID":"7e24c08e-fb74-4ae6-9c48-ae9653c964e8","Type":"ContainerDied","Data":"f35e82570b9df67ea7b1c3dfa062d26ada21f2ad2f14e7b74cb995ba55ebce6f"} Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.972164 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ws6qt" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.972177 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ws6qt" event={"ID":"0cd8fc51-deec-410b-b2bb-4818c2f71230","Type":"ContainerDied","Data":"75d5b89b3d055e34efd85b0c0c92d7321eb0a77cc476c9d678773513d6d4e949"} Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.974382 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zddtm" event={"ID":"8cccd0a8-35c0-4e22-b73c-bc9282c804b6","Type":"ContainerStarted","Data":"6a5a8cd91e518945dc5927f366208dffbbf9bec396287a480954c7e488d98620"} Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.974500 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zddtm" event={"ID":"8cccd0a8-35c0-4e22-b73c-bc9282c804b6","Type":"ContainerStarted","Data":"1478dc3b17e4099ddda3f1674240c19aa3249324ad2b494214e1b2ebf2f488c1"} Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.975466 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zddtm" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.978996 4734 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zddtm container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.60:8080/healthz\": dial tcp 10.217.0.60:8080: connect: connection refused" start-of-body= Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.979054 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zddtm" podUID="8cccd0a8-35c0-4e22-b73c-bc9282c804b6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.60:8080/healthz\": dial tcp 
10.217.0.60:8080: connect: connection refused" Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.984677 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kdt5b"] Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.988368 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kdt5b"] Dec 05 23:25:00 crc kubenswrapper[4734]: I1205 23:25:00.988661 4734 scope.go:117] "RemoveContainer" containerID="126cf9aed4db121243f46854a2e07e9b4913773b2b0e42dc3d6ffd27cb4c3229" Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.003402 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zddtm" podStartSLOduration=2.003382746 podStartE2EDuration="2.003382746s" podCreationTimestamp="2025-12-05 23:24:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:25:01.001473606 +0000 UTC m=+321.684877892" watchObservedRunningTime="2025-12-05 23:25:01.003382746 +0000 UTC m=+321.686787022" Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.014415 4734 scope.go:117] "RemoveContainer" containerID="5b110ebd20ad70373883a8604089e967c00f134bcab314c5426fd46a1753c413" Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.035446 4734 scope.go:117] "RemoveContainer" containerID="95a8a98bccba41294acbadfb078a33cdb8cfaf47c941ac39c5a646b1b981473d" Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.049466 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vj45t"] Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.065157 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vj45t"] Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.072353 4734 scope.go:117] "RemoveContainer" 
containerID="29e391a7080a1b0cf8c0ed2623b9ce8b1b62511013733cee8112ccf6ede9e797" Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.074575 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hmv47"] Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.080381 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hmv47"] Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.083659 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gnbkp"] Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.086402 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gnbkp"] Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.091185 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ws6qt"] Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.094746 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ws6qt"] Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.095656 4734 scope.go:117] "RemoveContainer" containerID="5b110ebd20ad70373883a8604089e967c00f134bcab314c5426fd46a1753c413" Dec 05 23:25:01 crc kubenswrapper[4734]: E1205 23:25:01.096415 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b110ebd20ad70373883a8604089e967c00f134bcab314c5426fd46a1753c413\": container with ID starting with 5b110ebd20ad70373883a8604089e967c00f134bcab314c5426fd46a1753c413 not found: ID does not exist" containerID="5b110ebd20ad70373883a8604089e967c00f134bcab314c5426fd46a1753c413" Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.096473 4734 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5b110ebd20ad70373883a8604089e967c00f134bcab314c5426fd46a1753c413"} err="failed to get container status \"5b110ebd20ad70373883a8604089e967c00f134bcab314c5426fd46a1753c413\": rpc error: code = NotFound desc = could not find container \"5b110ebd20ad70373883a8604089e967c00f134bcab314c5426fd46a1753c413\": container with ID starting with 5b110ebd20ad70373883a8604089e967c00f134bcab314c5426fd46a1753c413 not found: ID does not exist" Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.096509 4734 scope.go:117] "RemoveContainer" containerID="95a8a98bccba41294acbadfb078a33cdb8cfaf47c941ac39c5a646b1b981473d" Dec 05 23:25:01 crc kubenswrapper[4734]: E1205 23:25:01.097481 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95a8a98bccba41294acbadfb078a33cdb8cfaf47c941ac39c5a646b1b981473d\": container with ID starting with 95a8a98bccba41294acbadfb078a33cdb8cfaf47c941ac39c5a646b1b981473d not found: ID does not exist" containerID="95a8a98bccba41294acbadfb078a33cdb8cfaf47c941ac39c5a646b1b981473d" Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.097514 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95a8a98bccba41294acbadfb078a33cdb8cfaf47c941ac39c5a646b1b981473d"} err="failed to get container status \"95a8a98bccba41294acbadfb078a33cdb8cfaf47c941ac39c5a646b1b981473d\": rpc error: code = NotFound desc = could not find container \"95a8a98bccba41294acbadfb078a33cdb8cfaf47c941ac39c5a646b1b981473d\": container with ID starting with 95a8a98bccba41294acbadfb078a33cdb8cfaf47c941ac39c5a646b1b981473d not found: ID does not exist" Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.097548 4734 scope.go:117] "RemoveContainer" containerID="29e391a7080a1b0cf8c0ed2623b9ce8b1b62511013733cee8112ccf6ede9e797" Dec 05 23:25:01 crc kubenswrapper[4734]: E1205 23:25:01.097937 4734 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"29e391a7080a1b0cf8c0ed2623b9ce8b1b62511013733cee8112ccf6ede9e797\": container with ID starting with 29e391a7080a1b0cf8c0ed2623b9ce8b1b62511013733cee8112ccf6ede9e797 not found: ID does not exist" containerID="29e391a7080a1b0cf8c0ed2623b9ce8b1b62511013733cee8112ccf6ede9e797" Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.097963 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29e391a7080a1b0cf8c0ed2623b9ce8b1b62511013733cee8112ccf6ede9e797"} err="failed to get container status \"29e391a7080a1b0cf8c0ed2623b9ce8b1b62511013733cee8112ccf6ede9e797\": rpc error: code = NotFound desc = could not find container \"29e391a7080a1b0cf8c0ed2623b9ce8b1b62511013733cee8112ccf6ede9e797\": container with ID starting with 29e391a7080a1b0cf8c0ed2623b9ce8b1b62511013733cee8112ccf6ede9e797 not found: ID does not exist" Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.097992 4734 scope.go:117] "RemoveContainer" containerID="0ad97f0b80b9db497c44d78afb09b9217c04438574d4c9f552853398eddc04ec" Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.100883 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cb4rj"] Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.115102 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cb4rj"] Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.117748 4734 scope.go:117] "RemoveContainer" containerID="5a80b971db92270e639c74e75d715e51fd9aa35033f323865636c2d7e7f770ab" Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.134590 4734 scope.go:117] "RemoveContainer" containerID="7296522c020efa2b5dc85ed3aae2059722a3c0d9f8bb93314cacd8bb82249cdb" Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.150415 4734 scope.go:117] "RemoveContainer" 
containerID="8f3a5e56a02fbdb8a0a5a8736fd80d28d5c42b5deb684663ff6910b3b0b752bd" Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.164839 4734 scope.go:117] "RemoveContainer" containerID="efb50aabad288de103b27689366d3b4e6b599e57b689147cbf99a5c136ba5944" Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.180661 4734 scope.go:117] "RemoveContainer" containerID="f4751cf451faedbcaa626ab80383373cfd0ca970206a2e2b2d17fcb0c313ea32" Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.199217 4734 scope.go:117] "RemoveContainer" containerID="c893a3f02e495017bbd5ae00c480a8729469b43d7ab234724e209dc00e833de8" Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.216187 4734 scope.go:117] "RemoveContainer" containerID="a7e3f3325c47bebf23eaa664ce1f52666c0ad29104a891802550e978df6f9dc9" Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.247974 4734 scope.go:117] "RemoveContainer" containerID="5abb5e424fa45208e6807d339a109cdafafa143b061bf4664d1d324e20c734a4" Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.265354 4734 scope.go:117] "RemoveContainer" containerID="06cc21fd5615317d75503b49b30fc48c2b6ea896d839b4dba4330f382f0b5f3f" Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.620594 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cd8fc51-deec-410b-b2bb-4818c2f71230" path="/var/lib/kubelet/pods/0cd8fc51-deec-410b-b2bb-4818c2f71230/volumes" Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.621311 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" path="/var/lib/kubelet/pods/4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4/volumes" Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.622154 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5040a4a1-0b01-4581-89a7-37186c3caebe" path="/var/lib/kubelet/pods/5040a4a1-0b01-4581-89a7-37186c3caebe/volumes" Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.623609 4734 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="597348be-fe32-4495-bb10-d152ed593e3e" path="/var/lib/kubelet/pods/597348be-fe32-4495-bb10-d152ed593e3e/volumes" Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.624389 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" path="/var/lib/kubelet/pods/7ba0c803-1b80-4161-afa1-c9b6dc65ea00/volumes" Dec 05 23:25:01 crc kubenswrapper[4734]: I1205 23:25:01.625894 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e24c08e-fb74-4ae6-9c48-ae9653c964e8" path="/var/lib/kubelet/pods/7e24c08e-fb74-4ae6-9c48-ae9653c964e8/volumes" Dec 05 23:25:02 crc kubenswrapper[4734]: I1205 23:25:02.006958 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zddtm" Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 23:25:15.120777 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7d8d559c58-pjpvf"] Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 23:25:15.121715 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7d8d559c58-pjpvf" podUID="826af56b-5935-4d88-9ee6-9462c30cb589" containerName="controller-manager" containerID="cri-o://58c4dacccc89a564f0c64b58dab3c9e51bd17e297fc45e0aa3ba39453bdcc307" gracePeriod=30 Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 23:25:15.138925 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c9d9ff4b4-gkzt6"] Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 23:25:15.139311 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5c9d9ff4b4-gkzt6" podUID="e2bc5562-088e-4df8-a866-e1dae67ee011" containerName="route-controller-manager" 
containerID="cri-o://6d9089283595447afb3e2770c69e97a3f4ed92dd6e4b15de05dba1c69bf0f8b7" gracePeriod=30 Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 23:25:15.716817 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c9d9ff4b4-gkzt6" Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 23:25:15.719988 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d8d559c58-pjpvf" Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 23:25:15.827885 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/826af56b-5935-4d88-9ee6-9462c30cb589-serving-cert\") pod \"826af56b-5935-4d88-9ee6-9462c30cb589\" (UID: \"826af56b-5935-4d88-9ee6-9462c30cb589\") " Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 23:25:15.827958 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2bc5562-088e-4df8-a866-e1dae67ee011-config\") pod \"e2bc5562-088e-4df8-a866-e1dae67ee011\" (UID: \"e2bc5562-088e-4df8-a866-e1dae67ee011\") " Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 23:25:15.828029 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62xhw\" (UniqueName: \"kubernetes.io/projected/826af56b-5935-4d88-9ee6-9462c30cb589-kube-api-access-62xhw\") pod \"826af56b-5935-4d88-9ee6-9462c30cb589\" (UID: \"826af56b-5935-4d88-9ee6-9462c30cb589\") " Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 23:25:15.828089 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/826af56b-5935-4d88-9ee6-9462c30cb589-client-ca\") pod \"826af56b-5935-4d88-9ee6-9462c30cb589\" (UID: \"826af56b-5935-4d88-9ee6-9462c30cb589\") " Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 
23:25:15.828117 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e2bc5562-088e-4df8-a866-e1dae67ee011-client-ca\") pod \"e2bc5562-088e-4df8-a866-e1dae67ee011\" (UID: \"e2bc5562-088e-4df8-a866-e1dae67ee011\") " Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 23:25:15.828166 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/826af56b-5935-4d88-9ee6-9462c30cb589-config\") pod \"826af56b-5935-4d88-9ee6-9462c30cb589\" (UID: \"826af56b-5935-4d88-9ee6-9462c30cb589\") " Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 23:25:15.828230 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/826af56b-5935-4d88-9ee6-9462c30cb589-proxy-ca-bundles\") pod \"826af56b-5935-4d88-9ee6-9462c30cb589\" (UID: \"826af56b-5935-4d88-9ee6-9462c30cb589\") " Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 23:25:15.828256 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt8j8\" (UniqueName: \"kubernetes.io/projected/e2bc5562-088e-4df8-a866-e1dae67ee011-kube-api-access-xt8j8\") pod \"e2bc5562-088e-4df8-a866-e1dae67ee011\" (UID: \"e2bc5562-088e-4df8-a866-e1dae67ee011\") " Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 23:25:15.828281 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2bc5562-088e-4df8-a866-e1dae67ee011-serving-cert\") pod \"e2bc5562-088e-4df8-a866-e1dae67ee011\" (UID: \"e2bc5562-088e-4df8-a866-e1dae67ee011\") " Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 23:25:15.830400 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/826af56b-5935-4d88-9ee6-9462c30cb589-client-ca" (OuterVolumeSpecName: "client-ca") pod "826af56b-5935-4d88-9ee6-9462c30cb589" 
(UID: "826af56b-5935-4d88-9ee6-9462c30cb589"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 23:25:15.830834 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/826af56b-5935-4d88-9ee6-9462c30cb589-config" (OuterVolumeSpecName: "config") pod "826af56b-5935-4d88-9ee6-9462c30cb589" (UID: "826af56b-5935-4d88-9ee6-9462c30cb589"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 23:25:15.830980 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2bc5562-088e-4df8-a866-e1dae67ee011-client-ca" (OuterVolumeSpecName: "client-ca") pod "e2bc5562-088e-4df8-a866-e1dae67ee011" (UID: "e2bc5562-088e-4df8-a866-e1dae67ee011"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 23:25:15.831361 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/826af56b-5935-4d88-9ee6-9462c30cb589-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "826af56b-5935-4d88-9ee6-9462c30cb589" (UID: "826af56b-5935-4d88-9ee6-9462c30cb589"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 23:25:15.832821 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2bc5562-088e-4df8-a866-e1dae67ee011-config" (OuterVolumeSpecName: "config") pod "e2bc5562-088e-4df8-a866-e1dae67ee011" (UID: "e2bc5562-088e-4df8-a866-e1dae67ee011"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 23:25:15.837447 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/826af56b-5935-4d88-9ee6-9462c30cb589-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "826af56b-5935-4d88-9ee6-9462c30cb589" (UID: "826af56b-5935-4d88-9ee6-9462c30cb589"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 23:25:15.837502 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2bc5562-088e-4df8-a866-e1dae67ee011-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e2bc5562-088e-4df8-a866-e1dae67ee011" (UID: "e2bc5562-088e-4df8-a866-e1dae67ee011"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 23:25:15.837546 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2bc5562-088e-4df8-a866-e1dae67ee011-kube-api-access-xt8j8" (OuterVolumeSpecName: "kube-api-access-xt8j8") pod "e2bc5562-088e-4df8-a866-e1dae67ee011" (UID: "e2bc5562-088e-4df8-a866-e1dae67ee011"). InnerVolumeSpecName "kube-api-access-xt8j8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 23:25:15.839232 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/826af56b-5935-4d88-9ee6-9462c30cb589-kube-api-access-62xhw" (OuterVolumeSpecName: "kube-api-access-62xhw") pod "826af56b-5935-4d88-9ee6-9462c30cb589" (UID: "826af56b-5935-4d88-9ee6-9462c30cb589"). InnerVolumeSpecName "kube-api-access-62xhw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 23:25:15.929259 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62xhw\" (UniqueName: \"kubernetes.io/projected/826af56b-5935-4d88-9ee6-9462c30cb589-kube-api-access-62xhw\") on node \"crc\" DevicePath \"\"" Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 23:25:15.929304 4734 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/826af56b-5935-4d88-9ee6-9462c30cb589-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 23:25:15.929321 4734 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e2bc5562-088e-4df8-a866-e1dae67ee011-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 23:25:15.929332 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/826af56b-5935-4d88-9ee6-9462c30cb589-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 23:25:15.929346 4734 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/826af56b-5935-4d88-9ee6-9462c30cb589-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 23:25:15.929363 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt8j8\" (UniqueName: \"kubernetes.io/projected/e2bc5562-088e-4df8-a866-e1dae67ee011-kube-api-access-xt8j8\") on node \"crc\" DevicePath \"\"" Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 23:25:15.929376 4734 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2bc5562-088e-4df8-a866-e1dae67ee011-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 23:25:15.929389 4734 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/826af56b-5935-4d88-9ee6-9462c30cb589-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:25:15 crc kubenswrapper[4734]: I1205 23:25:15.929402 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2bc5562-088e-4df8-a866-e1dae67ee011-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.083203 4734 generic.go:334] "Generic (PLEG): container finished" podID="e2bc5562-088e-4df8-a866-e1dae67ee011" containerID="6d9089283595447afb3e2770c69e97a3f4ed92dd6e4b15de05dba1c69bf0f8b7" exitCode=0 Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.083286 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c9d9ff4b4-gkzt6" event={"ID":"e2bc5562-088e-4df8-a866-e1dae67ee011","Type":"ContainerDied","Data":"6d9089283595447afb3e2770c69e97a3f4ed92dd6e4b15de05dba1c69bf0f8b7"} Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.083331 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c9d9ff4b4-gkzt6" event={"ID":"e2bc5562-088e-4df8-a866-e1dae67ee011","Type":"ContainerDied","Data":"aa5d222bd84db7ed2940eab75c1addf7b2b0d71885847f891bb3bd9f3750b5d2"} Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.083356 4734 scope.go:117] "RemoveContainer" containerID="6d9089283595447afb3e2770c69e97a3f4ed92dd6e4b15de05dba1c69bf0f8b7" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.083573 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c9d9ff4b4-gkzt6" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.092693 4734 generic.go:334] "Generic (PLEG): container finished" podID="826af56b-5935-4d88-9ee6-9462c30cb589" containerID="58c4dacccc89a564f0c64b58dab3c9e51bd17e297fc45e0aa3ba39453bdcc307" exitCode=0 Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.092744 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d8d559c58-pjpvf" event={"ID":"826af56b-5935-4d88-9ee6-9462c30cb589","Type":"ContainerDied","Data":"58c4dacccc89a564f0c64b58dab3c9e51bd17e297fc45e0aa3ba39453bdcc307"} Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.092777 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d8d559c58-pjpvf" event={"ID":"826af56b-5935-4d88-9ee6-9462c30cb589","Type":"ContainerDied","Data":"9efc5683e777ea508fa2e784120f5e5e231e9a3fa8ad2c025ce9949e811bbfc5"} Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.092770 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7d8d559c58-pjpvf" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.118298 4734 scope.go:117] "RemoveContainer" containerID="6d9089283595447afb3e2770c69e97a3f4ed92dd6e4b15de05dba1c69bf0f8b7" Dec 05 23:25:16 crc kubenswrapper[4734]: E1205 23:25:16.119760 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d9089283595447afb3e2770c69e97a3f4ed92dd6e4b15de05dba1c69bf0f8b7\": container with ID starting with 6d9089283595447afb3e2770c69e97a3f4ed92dd6e4b15de05dba1c69bf0f8b7 not found: ID does not exist" containerID="6d9089283595447afb3e2770c69e97a3f4ed92dd6e4b15de05dba1c69bf0f8b7" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.119825 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d9089283595447afb3e2770c69e97a3f4ed92dd6e4b15de05dba1c69bf0f8b7"} err="failed to get container status \"6d9089283595447afb3e2770c69e97a3f4ed92dd6e4b15de05dba1c69bf0f8b7\": rpc error: code = NotFound desc = could not find container \"6d9089283595447afb3e2770c69e97a3f4ed92dd6e4b15de05dba1c69bf0f8b7\": container with ID starting with 6d9089283595447afb3e2770c69e97a3f4ed92dd6e4b15de05dba1c69bf0f8b7 not found: ID does not exist" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.119859 4734 scope.go:117] "RemoveContainer" containerID="58c4dacccc89a564f0c64b58dab3c9e51bd17e297fc45e0aa3ba39453bdcc307" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.121255 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c9d9ff4b4-gkzt6"] Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.130515 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c9d9ff4b4-gkzt6"] Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.137456 4734 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7d8d559c58-pjpvf"] Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.140756 4734 scope.go:117] "RemoveContainer" containerID="58c4dacccc89a564f0c64b58dab3c9e51bd17e297fc45e0aa3ba39453bdcc307" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.142463 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7d8d559c58-pjpvf"] Dec 05 23:25:16 crc kubenswrapper[4734]: E1205 23:25:16.142891 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58c4dacccc89a564f0c64b58dab3c9e51bd17e297fc45e0aa3ba39453bdcc307\": container with ID starting with 58c4dacccc89a564f0c64b58dab3c9e51bd17e297fc45e0aa3ba39453bdcc307 not found: ID does not exist" containerID="58c4dacccc89a564f0c64b58dab3c9e51bd17e297fc45e0aa3ba39453bdcc307" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.143052 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58c4dacccc89a564f0c64b58dab3c9e51bd17e297fc45e0aa3ba39453bdcc307"} err="failed to get container status \"58c4dacccc89a564f0c64b58dab3c9e51bd17e297fc45e0aa3ba39453bdcc307\": rpc error: code = NotFound desc = could not find container \"58c4dacccc89a564f0c64b58dab3c9e51bd17e297fc45e0aa3ba39453bdcc307\": container with ID starting with 58c4dacccc89a564f0c64b58dab3c9e51bd17e297fc45e0aa3ba39453bdcc307 not found: ID does not exist" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.696986 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-76c476bb96-45nfj"] Dec 05 23:25:16 crc kubenswrapper[4734]: E1205 23:25:16.697397 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" containerName="extract-content" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 
23:25:16.697414 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" containerName="extract-content" Dec 05 23:25:16 crc kubenswrapper[4734]: E1205 23:25:16.697430 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e24c08e-fb74-4ae6-9c48-ae9653c964e8" containerName="extract-content" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.697437 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e24c08e-fb74-4ae6-9c48-ae9653c964e8" containerName="extract-content" Dec 05 23:25:16 crc kubenswrapper[4734]: E1205 23:25:16.697449 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="597348be-fe32-4495-bb10-d152ed593e3e" containerName="extract-utilities" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.697460 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="597348be-fe32-4495-bb10-d152ed593e3e" containerName="extract-utilities" Dec 05 23:25:16 crc kubenswrapper[4734]: E1205 23:25:16.697478 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" containerName="registry-server" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.697486 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" containerName="registry-server" Dec 05 23:25:16 crc kubenswrapper[4734]: E1205 23:25:16.697496 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bc5562-088e-4df8-a866-e1dae67ee011" containerName="route-controller-manager" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.697507 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bc5562-088e-4df8-a866-e1dae67ee011" containerName="route-controller-manager" Dec 05 23:25:16 crc kubenswrapper[4734]: E1205 23:25:16.697548 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="597348be-fe32-4495-bb10-d152ed593e3e" containerName="extract-content" Dec 05 23:25:16 crc 
kubenswrapper[4734]: I1205 23:25:16.697558 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="597348be-fe32-4495-bb10-d152ed593e3e" containerName="extract-content" Dec 05 23:25:16 crc kubenswrapper[4734]: E1205 23:25:16.697569 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" containerName="extract-utilities" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.697576 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" containerName="extract-utilities" Dec 05 23:25:16 crc kubenswrapper[4734]: E1205 23:25:16.697592 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5040a4a1-0b01-4581-89a7-37186c3caebe" containerName="registry-server" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.697603 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="5040a4a1-0b01-4581-89a7-37186c3caebe" containerName="registry-server" Dec 05 23:25:16 crc kubenswrapper[4734]: E1205 23:25:16.697620 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e24c08e-fb74-4ae6-9c48-ae9653c964e8" containerName="registry-server" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.697628 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e24c08e-fb74-4ae6-9c48-ae9653c964e8" containerName="registry-server" Dec 05 23:25:16 crc kubenswrapper[4734]: E1205 23:25:16.697639 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" containerName="registry-server" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.697646 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" containerName="registry-server" Dec 05 23:25:16 crc kubenswrapper[4734]: E1205 23:25:16.697657 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cd8fc51-deec-410b-b2bb-4818c2f71230" containerName="marketplace-operator" Dec 05 23:25:16 crc 
kubenswrapper[4734]: I1205 23:25:16.697665 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cd8fc51-deec-410b-b2bb-4818c2f71230" containerName="marketplace-operator" Dec 05 23:25:16 crc kubenswrapper[4734]: E1205 23:25:16.697673 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5040a4a1-0b01-4581-89a7-37186c3caebe" containerName="extract-utilities" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.697681 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="5040a4a1-0b01-4581-89a7-37186c3caebe" containerName="extract-utilities" Dec 05 23:25:16 crc kubenswrapper[4734]: E1205 23:25:16.697690 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="826af56b-5935-4d88-9ee6-9462c30cb589" containerName="controller-manager" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.697697 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="826af56b-5935-4d88-9ee6-9462c30cb589" containerName="controller-manager" Dec 05 23:25:16 crc kubenswrapper[4734]: E1205 23:25:16.697706 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" containerName="extract-utilities" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.697713 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" containerName="extract-utilities" Dec 05 23:25:16 crc kubenswrapper[4734]: E1205 23:25:16.697725 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cd8fc51-deec-410b-b2bb-4818c2f71230" containerName="marketplace-operator" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.697732 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cd8fc51-deec-410b-b2bb-4818c2f71230" containerName="marketplace-operator" Dec 05 23:25:16 crc kubenswrapper[4734]: E1205 23:25:16.697742 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5040a4a1-0b01-4581-89a7-37186c3caebe" containerName="extract-content" Dec 
05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.697749 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="5040a4a1-0b01-4581-89a7-37186c3caebe" containerName="extract-content" Dec 05 23:25:16 crc kubenswrapper[4734]: E1205 23:25:16.697759 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" containerName="extract-content" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.697766 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" containerName="extract-content" Dec 05 23:25:16 crc kubenswrapper[4734]: E1205 23:25:16.697773 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="597348be-fe32-4495-bb10-d152ed593e3e" containerName="registry-server" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.697779 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="597348be-fe32-4495-bb10-d152ed593e3e" containerName="registry-server" Dec 05 23:25:16 crc kubenswrapper[4734]: E1205 23:25:16.697788 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e24c08e-fb74-4ae6-9c48-ae9653c964e8" containerName="extract-utilities" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.697793 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e24c08e-fb74-4ae6-9c48-ae9653c964e8" containerName="extract-utilities" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.697917 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="597348be-fe32-4495-bb10-d152ed593e3e" containerName="registry-server" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.697929 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="5040a4a1-0b01-4581-89a7-37186c3caebe" containerName="registry-server" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.697938 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ba0c803-1b80-4161-afa1-c9b6dc65ea00" containerName="registry-server" Dec 05 
23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.697947 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b7ae74a-0552-4e8b-9ce4-b8b9e5f389b4" containerName="registry-server" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.697955 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cd8fc51-deec-410b-b2bb-4818c2f71230" containerName="marketplace-operator" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.697966 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cd8fc51-deec-410b-b2bb-4818c2f71230" containerName="marketplace-operator" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.697973 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="826af56b-5935-4d88-9ee6-9462c30cb589" containerName="controller-manager" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.697982 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e24c08e-fb74-4ae6-9c48-ae9653c964e8" containerName="registry-server" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.697993 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2bc5562-088e-4df8-a866-e1dae67ee011" containerName="route-controller-manager" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.700351 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76c476bb96-45nfj" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.701911 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-769556f88d-59fqk"] Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.702950 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-769556f88d-59fqk" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.709204 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.709578 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.709685 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.709723 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.709671 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.709867 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.710167 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.710654 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.710838 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.711120 4734 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.710973 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.711773 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.716359 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.726380 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76c476bb96-45nfj"] Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.732805 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-769556f88d-59fqk"] Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.740715 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gw4w\" (UniqueName: \"kubernetes.io/projected/a5ad0e8a-529b-4b7e-a71a-54cff83de295-kube-api-access-6gw4w\") pod \"controller-manager-76c476bb96-45nfj\" (UID: \"a5ad0e8a-529b-4b7e-a71a-54cff83de295\") " pod="openshift-controller-manager/controller-manager-76c476bb96-45nfj" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.740786 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f65818d6-ac70-430a-bf2d-3d2d46fa01b5-config\") pod \"route-controller-manager-769556f88d-59fqk\" (UID: \"f65818d6-ac70-430a-bf2d-3d2d46fa01b5\") " pod="openshift-route-controller-manager/route-controller-manager-769556f88d-59fqk" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.740818 4734 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5ad0e8a-529b-4b7e-a71a-54cff83de295-config\") pod \"controller-manager-76c476bb96-45nfj\" (UID: \"a5ad0e8a-529b-4b7e-a71a-54cff83de295\") " pod="openshift-controller-manager/controller-manager-76c476bb96-45nfj" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.740846 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a5ad0e8a-529b-4b7e-a71a-54cff83de295-proxy-ca-bundles\") pod \"controller-manager-76c476bb96-45nfj\" (UID: \"a5ad0e8a-529b-4b7e-a71a-54cff83de295\") " pod="openshift-controller-manager/controller-manager-76c476bb96-45nfj" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.740884 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f65818d6-ac70-430a-bf2d-3d2d46fa01b5-serving-cert\") pod \"route-controller-manager-769556f88d-59fqk\" (UID: \"f65818d6-ac70-430a-bf2d-3d2d46fa01b5\") " pod="openshift-route-controller-manager/route-controller-manager-769556f88d-59fqk" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.740908 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5ad0e8a-529b-4b7e-a71a-54cff83de295-serving-cert\") pod \"controller-manager-76c476bb96-45nfj\" (UID: \"a5ad0e8a-529b-4b7e-a71a-54cff83de295\") " pod="openshift-controller-manager/controller-manager-76c476bb96-45nfj" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.741182 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f65818d6-ac70-430a-bf2d-3d2d46fa01b5-client-ca\") pod 
\"route-controller-manager-769556f88d-59fqk\" (UID: \"f65818d6-ac70-430a-bf2d-3d2d46fa01b5\") " pod="openshift-route-controller-manager/route-controller-manager-769556f88d-59fqk" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.741292 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk9vl\" (UniqueName: \"kubernetes.io/projected/f65818d6-ac70-430a-bf2d-3d2d46fa01b5-kube-api-access-xk9vl\") pod \"route-controller-manager-769556f88d-59fqk\" (UID: \"f65818d6-ac70-430a-bf2d-3d2d46fa01b5\") " pod="openshift-route-controller-manager/route-controller-manager-769556f88d-59fqk" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.741327 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a5ad0e8a-529b-4b7e-a71a-54cff83de295-client-ca\") pod \"controller-manager-76c476bb96-45nfj\" (UID: \"a5ad0e8a-529b-4b7e-a71a-54cff83de295\") " pod="openshift-controller-manager/controller-manager-76c476bb96-45nfj" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.842759 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk9vl\" (UniqueName: \"kubernetes.io/projected/f65818d6-ac70-430a-bf2d-3d2d46fa01b5-kube-api-access-xk9vl\") pod \"route-controller-manager-769556f88d-59fqk\" (UID: \"f65818d6-ac70-430a-bf2d-3d2d46fa01b5\") " pod="openshift-route-controller-manager/route-controller-manager-769556f88d-59fqk" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.843129 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a5ad0e8a-529b-4b7e-a71a-54cff83de295-client-ca\") pod \"controller-manager-76c476bb96-45nfj\" (UID: \"a5ad0e8a-529b-4b7e-a71a-54cff83de295\") " pod="openshift-controller-manager/controller-manager-76c476bb96-45nfj" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 
23:25:16.843319 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gw4w\" (UniqueName: \"kubernetes.io/projected/a5ad0e8a-529b-4b7e-a71a-54cff83de295-kube-api-access-6gw4w\") pod \"controller-manager-76c476bb96-45nfj\" (UID: \"a5ad0e8a-529b-4b7e-a71a-54cff83de295\") " pod="openshift-controller-manager/controller-manager-76c476bb96-45nfj" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.843454 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f65818d6-ac70-430a-bf2d-3d2d46fa01b5-config\") pod \"route-controller-manager-769556f88d-59fqk\" (UID: \"f65818d6-ac70-430a-bf2d-3d2d46fa01b5\") " pod="openshift-route-controller-manager/route-controller-manager-769556f88d-59fqk" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.843621 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5ad0e8a-529b-4b7e-a71a-54cff83de295-config\") pod \"controller-manager-76c476bb96-45nfj\" (UID: \"a5ad0e8a-529b-4b7e-a71a-54cff83de295\") " pod="openshift-controller-manager/controller-manager-76c476bb96-45nfj" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.843755 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a5ad0e8a-529b-4b7e-a71a-54cff83de295-proxy-ca-bundles\") pod \"controller-manager-76c476bb96-45nfj\" (UID: \"a5ad0e8a-529b-4b7e-a71a-54cff83de295\") " pod="openshift-controller-manager/controller-manager-76c476bb96-45nfj" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.843863 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f65818d6-ac70-430a-bf2d-3d2d46fa01b5-serving-cert\") pod \"route-controller-manager-769556f88d-59fqk\" (UID: \"f65818d6-ac70-430a-bf2d-3d2d46fa01b5\") " 
pod="openshift-route-controller-manager/route-controller-manager-769556f88d-59fqk" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.843962 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5ad0e8a-529b-4b7e-a71a-54cff83de295-serving-cert\") pod \"controller-manager-76c476bb96-45nfj\" (UID: \"a5ad0e8a-529b-4b7e-a71a-54cff83de295\") " pod="openshift-controller-manager/controller-manager-76c476bb96-45nfj" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.844076 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f65818d6-ac70-430a-bf2d-3d2d46fa01b5-client-ca\") pod \"route-controller-manager-769556f88d-59fqk\" (UID: \"f65818d6-ac70-430a-bf2d-3d2d46fa01b5\") " pod="openshift-route-controller-manager/route-controller-manager-769556f88d-59fqk" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.844766 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f65818d6-ac70-430a-bf2d-3d2d46fa01b5-config\") pod \"route-controller-manager-769556f88d-59fqk\" (UID: \"f65818d6-ac70-430a-bf2d-3d2d46fa01b5\") " pod="openshift-route-controller-manager/route-controller-manager-769556f88d-59fqk" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.845401 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f65818d6-ac70-430a-bf2d-3d2d46fa01b5-client-ca\") pod \"route-controller-manager-769556f88d-59fqk\" (UID: \"f65818d6-ac70-430a-bf2d-3d2d46fa01b5\") " pod="openshift-route-controller-manager/route-controller-manager-769556f88d-59fqk" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.845628 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/a5ad0e8a-529b-4b7e-a71a-54cff83de295-client-ca\") pod \"controller-manager-76c476bb96-45nfj\" (UID: \"a5ad0e8a-529b-4b7e-a71a-54cff83de295\") " pod="openshift-controller-manager/controller-manager-76c476bb96-45nfj" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.849132 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5ad0e8a-529b-4b7e-a71a-54cff83de295-config\") pod \"controller-manager-76c476bb96-45nfj\" (UID: \"a5ad0e8a-529b-4b7e-a71a-54cff83de295\") " pod="openshift-controller-manager/controller-manager-76c476bb96-45nfj" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.849804 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a5ad0e8a-529b-4b7e-a71a-54cff83de295-proxy-ca-bundles\") pod \"controller-manager-76c476bb96-45nfj\" (UID: \"a5ad0e8a-529b-4b7e-a71a-54cff83de295\") " pod="openshift-controller-manager/controller-manager-76c476bb96-45nfj" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.850629 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f65818d6-ac70-430a-bf2d-3d2d46fa01b5-serving-cert\") pod \"route-controller-manager-769556f88d-59fqk\" (UID: \"f65818d6-ac70-430a-bf2d-3d2d46fa01b5\") " pod="openshift-route-controller-manager/route-controller-manager-769556f88d-59fqk" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.851242 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5ad0e8a-529b-4b7e-a71a-54cff83de295-serving-cert\") pod \"controller-manager-76c476bb96-45nfj\" (UID: \"a5ad0e8a-529b-4b7e-a71a-54cff83de295\") " pod="openshift-controller-manager/controller-manager-76c476bb96-45nfj" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.861545 4734 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6gw4w\" (UniqueName: \"kubernetes.io/projected/a5ad0e8a-529b-4b7e-a71a-54cff83de295-kube-api-access-6gw4w\") pod \"controller-manager-76c476bb96-45nfj\" (UID: \"a5ad0e8a-529b-4b7e-a71a-54cff83de295\") " pod="openshift-controller-manager/controller-manager-76c476bb96-45nfj" Dec 05 23:25:16 crc kubenswrapper[4734]: I1205 23:25:16.864057 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk9vl\" (UniqueName: \"kubernetes.io/projected/f65818d6-ac70-430a-bf2d-3d2d46fa01b5-kube-api-access-xk9vl\") pod \"route-controller-manager-769556f88d-59fqk\" (UID: \"f65818d6-ac70-430a-bf2d-3d2d46fa01b5\") " pod="openshift-route-controller-manager/route-controller-manager-769556f88d-59fqk" Dec 05 23:25:17 crc kubenswrapper[4734]: I1205 23:25:17.036977 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76c476bb96-45nfj" Dec 05 23:25:17 crc kubenswrapper[4734]: I1205 23:25:17.040785 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-769556f88d-59fqk" Dec 05 23:25:17 crc kubenswrapper[4734]: I1205 23:25:17.498630 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-769556f88d-59fqk"] Dec 05 23:25:17 crc kubenswrapper[4734]: W1205 23:25:17.507717 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf65818d6_ac70_430a_bf2d_3d2d46fa01b5.slice/crio-804452b72ffc72a7cca22b16af283164e761f8bb8f796b7df208f11750f62fd7 WatchSource:0}: Error finding container 804452b72ffc72a7cca22b16af283164e761f8bb8f796b7df208f11750f62fd7: Status 404 returned error can't find the container with id 804452b72ffc72a7cca22b16af283164e761f8bb8f796b7df208f11750f62fd7 Dec 05 23:25:17 crc kubenswrapper[4734]: I1205 23:25:17.581601 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76c476bb96-45nfj"] Dec 05 23:25:17 crc kubenswrapper[4734]: W1205 23:25:17.588980 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5ad0e8a_529b_4b7e_a71a_54cff83de295.slice/crio-d6ea36324ffdab3f124179cda6e260d785ed60d8dfb5f2fd56d5bfa182085fda WatchSource:0}: Error finding container d6ea36324ffdab3f124179cda6e260d785ed60d8dfb5f2fd56d5bfa182085fda: Status 404 returned error can't find the container with id d6ea36324ffdab3f124179cda6e260d785ed60d8dfb5f2fd56d5bfa182085fda Dec 05 23:25:17 crc kubenswrapper[4734]: I1205 23:25:17.630054 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="826af56b-5935-4d88-9ee6-9462c30cb589" path="/var/lib/kubelet/pods/826af56b-5935-4d88-9ee6-9462c30cb589/volumes" Dec 05 23:25:17 crc kubenswrapper[4734]: I1205 23:25:17.631369 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e2bc5562-088e-4df8-a866-e1dae67ee011" path="/var/lib/kubelet/pods/e2bc5562-088e-4df8-a866-e1dae67ee011/volumes" Dec 05 23:25:18 crc kubenswrapper[4734]: I1205 23:25:18.113981 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-769556f88d-59fqk" event={"ID":"f65818d6-ac70-430a-bf2d-3d2d46fa01b5","Type":"ContainerStarted","Data":"03bae04341545e6cde5cdf9c6ac01ef431af71501325f2d08111c47f8387678b"} Dec 05 23:25:18 crc kubenswrapper[4734]: I1205 23:25:18.114047 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-769556f88d-59fqk" Dec 05 23:25:18 crc kubenswrapper[4734]: I1205 23:25:18.114060 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-769556f88d-59fqk" event={"ID":"f65818d6-ac70-430a-bf2d-3d2d46fa01b5","Type":"ContainerStarted","Data":"804452b72ffc72a7cca22b16af283164e761f8bb8f796b7df208f11750f62fd7"} Dec 05 23:25:18 crc kubenswrapper[4734]: I1205 23:25:18.116003 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76c476bb96-45nfj" event={"ID":"a5ad0e8a-529b-4b7e-a71a-54cff83de295","Type":"ContainerStarted","Data":"bcca1761cd2fee23b448f8d7e85ff10327d8d83c260f91debfefc12f6f3c0d7e"} Dec 05 23:25:18 crc kubenswrapper[4734]: I1205 23:25:18.116076 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76c476bb96-45nfj" event={"ID":"a5ad0e8a-529b-4b7e-a71a-54cff83de295","Type":"ContainerStarted","Data":"d6ea36324ffdab3f124179cda6e260d785ed60d8dfb5f2fd56d5bfa182085fda"} Dec 05 23:25:18 crc kubenswrapper[4734]: I1205 23:25:18.116232 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-76c476bb96-45nfj" Dec 05 23:25:18 crc kubenswrapper[4734]: I1205 23:25:18.119380 4734 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-769556f88d-59fqk" Dec 05 23:25:18 crc kubenswrapper[4734]: I1205 23:25:18.121317 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-76c476bb96-45nfj" Dec 05 23:25:18 crc kubenswrapper[4734]: I1205 23:25:18.136692 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-769556f88d-59fqk" podStartSLOduration=3.136646181 podStartE2EDuration="3.136646181s" podCreationTimestamp="2025-12-05 23:25:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:25:18.132213668 +0000 UTC m=+338.815617954" watchObservedRunningTime="2025-12-05 23:25:18.136646181 +0000 UTC m=+338.820050457" Dec 05 23:25:18 crc kubenswrapper[4734]: I1205 23:25:18.183037 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-76c476bb96-45nfj" podStartSLOduration=3.183011691 podStartE2EDuration="3.183011691s" podCreationTimestamp="2025-12-05 23:25:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:25:18.181233755 +0000 UTC m=+338.864638051" watchObservedRunningTime="2025-12-05 23:25:18.183011691 +0000 UTC m=+338.866415957" Dec 05 23:25:20 crc kubenswrapper[4734]: I1205 23:25:20.444901 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:25:20 crc kubenswrapper[4734]: I1205 23:25:20.445000 4734 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:25:28 crc kubenswrapper[4734]: I1205 23:25:28.078976 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zvcrv"] Dec 05 23:25:28 crc kubenswrapper[4734]: I1205 23:25:28.081637 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zvcrv" Dec 05 23:25:28 crc kubenswrapper[4734]: I1205 23:25:28.084400 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 05 23:25:28 crc kubenswrapper[4734]: I1205 23:25:28.092433 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zvcrv"] Dec 05 23:25:28 crc kubenswrapper[4734]: I1205 23:25:28.198217 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03642c63-70ad-48c4-9fa1-c0e8d2d0d067-catalog-content\") pod \"certified-operators-zvcrv\" (UID: \"03642c63-70ad-48c4-9fa1-c0e8d2d0d067\") " pod="openshift-marketplace/certified-operators-zvcrv" Dec 05 23:25:28 crc kubenswrapper[4734]: I1205 23:25:28.198283 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5lbd\" (UniqueName: \"kubernetes.io/projected/03642c63-70ad-48c4-9fa1-c0e8d2d0d067-kube-api-access-p5lbd\") pod \"certified-operators-zvcrv\" (UID: \"03642c63-70ad-48c4-9fa1-c0e8d2d0d067\") " pod="openshift-marketplace/certified-operators-zvcrv" Dec 05 23:25:28 crc kubenswrapper[4734]: I1205 23:25:28.198309 4734 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03642c63-70ad-48c4-9fa1-c0e8d2d0d067-utilities\") pod \"certified-operators-zvcrv\" (UID: \"03642c63-70ad-48c4-9fa1-c0e8d2d0d067\") " pod="openshift-marketplace/certified-operators-zvcrv" Dec 05 23:25:28 crc kubenswrapper[4734]: I1205 23:25:28.268386 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bspnj"] Dec 05 23:25:28 crc kubenswrapper[4734]: I1205 23:25:28.269807 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bspnj" Dec 05 23:25:28 crc kubenswrapper[4734]: W1205 23:25:28.276129 4734 reflector.go:561] object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh": failed to list *v1.Secret: secrets "redhat-operators-dockercfg-ct8rh" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-marketplace": no relationship found between node 'crc' and this object Dec 05 23:25:28 crc kubenswrapper[4734]: E1205 23:25:28.276189 4734 reflector.go:158] "Unhandled Error" err="object-\"openshift-marketplace\"/\"redhat-operators-dockercfg-ct8rh\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"redhat-operators-dockercfg-ct8rh\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-marketplace\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 05 23:25:28 crc kubenswrapper[4734]: I1205 23:25:28.290166 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bspnj"] Dec 05 23:25:28 crc kubenswrapper[4734]: I1205 23:25:28.300789 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5lbd\" (UniqueName: 
\"kubernetes.io/projected/03642c63-70ad-48c4-9fa1-c0e8d2d0d067-kube-api-access-p5lbd\") pod \"certified-operators-zvcrv\" (UID: \"03642c63-70ad-48c4-9fa1-c0e8d2d0d067\") " pod="openshift-marketplace/certified-operators-zvcrv" Dec 05 23:25:28 crc kubenswrapper[4734]: I1205 23:25:28.300858 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03642c63-70ad-48c4-9fa1-c0e8d2d0d067-utilities\") pod \"certified-operators-zvcrv\" (UID: \"03642c63-70ad-48c4-9fa1-c0e8d2d0d067\") " pod="openshift-marketplace/certified-operators-zvcrv" Dec 05 23:25:28 crc kubenswrapper[4734]: I1205 23:25:28.300931 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03642c63-70ad-48c4-9fa1-c0e8d2d0d067-catalog-content\") pod \"certified-operators-zvcrv\" (UID: \"03642c63-70ad-48c4-9fa1-c0e8d2d0d067\") " pod="openshift-marketplace/certified-operators-zvcrv" Dec 05 23:25:28 crc kubenswrapper[4734]: I1205 23:25:28.301739 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03642c63-70ad-48c4-9fa1-c0e8d2d0d067-utilities\") pod \"certified-operators-zvcrv\" (UID: \"03642c63-70ad-48c4-9fa1-c0e8d2d0d067\") " pod="openshift-marketplace/certified-operators-zvcrv" Dec 05 23:25:28 crc kubenswrapper[4734]: I1205 23:25:28.302281 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03642c63-70ad-48c4-9fa1-c0e8d2d0d067-catalog-content\") pod \"certified-operators-zvcrv\" (UID: \"03642c63-70ad-48c4-9fa1-c0e8d2d0d067\") " pod="openshift-marketplace/certified-operators-zvcrv" Dec 05 23:25:28 crc kubenswrapper[4734]: I1205 23:25:28.342669 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5lbd\" (UniqueName: 
\"kubernetes.io/projected/03642c63-70ad-48c4-9fa1-c0e8d2d0d067-kube-api-access-p5lbd\") pod \"certified-operators-zvcrv\" (UID: \"03642c63-70ad-48c4-9fa1-c0e8d2d0d067\") " pod="openshift-marketplace/certified-operators-zvcrv" Dec 05 23:25:28 crc kubenswrapper[4734]: I1205 23:25:28.402837 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e82acab-ae84-48a7-83bd-7f83a96e3f7f-utilities\") pod \"redhat-operators-bspnj\" (UID: \"9e82acab-ae84-48a7-83bd-7f83a96e3f7f\") " pod="openshift-marketplace/redhat-operators-bspnj" Dec 05 23:25:28 crc kubenswrapper[4734]: I1205 23:25:28.403384 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e82acab-ae84-48a7-83bd-7f83a96e3f7f-catalog-content\") pod \"redhat-operators-bspnj\" (UID: \"9e82acab-ae84-48a7-83bd-7f83a96e3f7f\") " pod="openshift-marketplace/redhat-operators-bspnj" Dec 05 23:25:28 crc kubenswrapper[4734]: I1205 23:25:28.403423 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjg2k\" (UniqueName: \"kubernetes.io/projected/9e82acab-ae84-48a7-83bd-7f83a96e3f7f-kube-api-access-kjg2k\") pod \"redhat-operators-bspnj\" (UID: \"9e82acab-ae84-48a7-83bd-7f83a96e3f7f\") " pod="openshift-marketplace/redhat-operators-bspnj" Dec 05 23:25:28 crc kubenswrapper[4734]: I1205 23:25:28.404580 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zvcrv" Dec 05 23:25:28 crc kubenswrapper[4734]: I1205 23:25:28.505130 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e82acab-ae84-48a7-83bd-7f83a96e3f7f-catalog-content\") pod \"redhat-operators-bspnj\" (UID: \"9e82acab-ae84-48a7-83bd-7f83a96e3f7f\") " pod="openshift-marketplace/redhat-operators-bspnj" Dec 05 23:25:28 crc kubenswrapper[4734]: I1205 23:25:28.505250 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjg2k\" (UniqueName: \"kubernetes.io/projected/9e82acab-ae84-48a7-83bd-7f83a96e3f7f-kube-api-access-kjg2k\") pod \"redhat-operators-bspnj\" (UID: \"9e82acab-ae84-48a7-83bd-7f83a96e3f7f\") " pod="openshift-marketplace/redhat-operators-bspnj" Dec 05 23:25:28 crc kubenswrapper[4734]: I1205 23:25:28.505345 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e82acab-ae84-48a7-83bd-7f83a96e3f7f-utilities\") pod \"redhat-operators-bspnj\" (UID: \"9e82acab-ae84-48a7-83bd-7f83a96e3f7f\") " pod="openshift-marketplace/redhat-operators-bspnj" Dec 05 23:25:28 crc kubenswrapper[4734]: I1205 23:25:28.506820 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e82acab-ae84-48a7-83bd-7f83a96e3f7f-utilities\") pod \"redhat-operators-bspnj\" (UID: \"9e82acab-ae84-48a7-83bd-7f83a96e3f7f\") " pod="openshift-marketplace/redhat-operators-bspnj" Dec 05 23:25:28 crc kubenswrapper[4734]: I1205 23:25:28.508457 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e82acab-ae84-48a7-83bd-7f83a96e3f7f-catalog-content\") pod \"redhat-operators-bspnj\" (UID: \"9e82acab-ae84-48a7-83bd-7f83a96e3f7f\") " 
pod="openshift-marketplace/redhat-operators-bspnj" Dec 05 23:25:28 crc kubenswrapper[4734]: I1205 23:25:28.543141 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjg2k\" (UniqueName: \"kubernetes.io/projected/9e82acab-ae84-48a7-83bd-7f83a96e3f7f-kube-api-access-kjg2k\") pod \"redhat-operators-bspnj\" (UID: \"9e82acab-ae84-48a7-83bd-7f83a96e3f7f\") " pod="openshift-marketplace/redhat-operators-bspnj" Dec 05 23:25:28 crc kubenswrapper[4734]: I1205 23:25:28.905458 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zvcrv"] Dec 05 23:25:29 crc kubenswrapper[4734]: I1205 23:25:29.186731 4734 generic.go:334] "Generic (PLEG): container finished" podID="03642c63-70ad-48c4-9fa1-c0e8d2d0d067" containerID="196b318f35144c491e2b77e4ab84b2c3acec3172167e715f54f37dcf36f0738c" exitCode=0 Dec 05 23:25:29 crc kubenswrapper[4734]: I1205 23:25:29.186792 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zvcrv" event={"ID":"03642c63-70ad-48c4-9fa1-c0e8d2d0d067","Type":"ContainerDied","Data":"196b318f35144c491e2b77e4ab84b2c3acec3172167e715f54f37dcf36f0738c"} Dec 05 23:25:29 crc kubenswrapper[4734]: I1205 23:25:29.187193 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zvcrv" event={"ID":"03642c63-70ad-48c4-9fa1-c0e8d2d0d067","Type":"ContainerStarted","Data":"b18c24ad5df5c8d9ee39944b1731eac5da4c28f7e10c9240feef78ed07bdd247"} Dec 05 23:25:29 crc kubenswrapper[4734]: I1205 23:25:29.589405 4734 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-marketplace/redhat-operators-bspnj" secret="" err="failed to sync secret cache: timed out waiting for the condition" Dec 05 23:25:29 crc kubenswrapper[4734]: I1205 23:25:29.589563 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bspnj" Dec 05 23:25:29 crc kubenswrapper[4734]: I1205 23:25:29.740514 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 05 23:25:30 crc kubenswrapper[4734]: I1205 23:25:30.036185 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bspnj"] Dec 05 23:25:30 crc kubenswrapper[4734]: I1205 23:25:30.197910 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zvcrv" event={"ID":"03642c63-70ad-48c4-9fa1-c0e8d2d0d067","Type":"ContainerStarted","Data":"6a8f3097022e447597a4b4015ac8e55a7aec40848a066fb6b6900f8d24a73b35"} Dec 05 23:25:30 crc kubenswrapper[4734]: I1205 23:25:30.199776 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bspnj" event={"ID":"9e82acab-ae84-48a7-83bd-7f83a96e3f7f","Type":"ContainerStarted","Data":"79cd680d8b553cc79807f067f9f8b1b4e5601c6b83f3fa7d9935461c57e0ee3d"} Dec 05 23:25:30 crc kubenswrapper[4734]: I1205 23:25:30.466311 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rxk7p"] Dec 05 23:25:30 crc kubenswrapper[4734]: I1205 23:25:30.467724 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rxk7p" Dec 05 23:25:30 crc kubenswrapper[4734]: I1205 23:25:30.470397 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 05 23:25:30 crc kubenswrapper[4734]: I1205 23:25:30.477894 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rxk7p"] Dec 05 23:25:30 crc kubenswrapper[4734]: I1205 23:25:30.639460 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw5sv\" (UniqueName: \"kubernetes.io/projected/1d3776cd-4682-4c8b-94e2-73bc8c1ee60e-kube-api-access-lw5sv\") pod \"community-operators-rxk7p\" (UID: \"1d3776cd-4682-4c8b-94e2-73bc8c1ee60e\") " pod="openshift-marketplace/community-operators-rxk7p" Dec 05 23:25:30 crc kubenswrapper[4734]: I1205 23:25:30.639515 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d3776cd-4682-4c8b-94e2-73bc8c1ee60e-utilities\") pod \"community-operators-rxk7p\" (UID: \"1d3776cd-4682-4c8b-94e2-73bc8c1ee60e\") " pod="openshift-marketplace/community-operators-rxk7p" Dec 05 23:25:30 crc kubenswrapper[4734]: I1205 23:25:30.639590 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d3776cd-4682-4c8b-94e2-73bc8c1ee60e-catalog-content\") pod \"community-operators-rxk7p\" (UID: \"1d3776cd-4682-4c8b-94e2-73bc8c1ee60e\") " pod="openshift-marketplace/community-operators-rxk7p" Dec 05 23:25:30 crc kubenswrapper[4734]: I1205 23:25:30.664810 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fxgbf"] Dec 05 23:25:30 crc kubenswrapper[4734]: I1205 23:25:30.665953 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fxgbf" Dec 05 23:25:30 crc kubenswrapper[4734]: I1205 23:25:30.668241 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 05 23:25:30 crc kubenswrapper[4734]: I1205 23:25:30.680823 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fxgbf"] Dec 05 23:25:30 crc kubenswrapper[4734]: I1205 23:25:30.740398 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw5sv\" (UniqueName: \"kubernetes.io/projected/1d3776cd-4682-4c8b-94e2-73bc8c1ee60e-kube-api-access-lw5sv\") pod \"community-operators-rxk7p\" (UID: \"1d3776cd-4682-4c8b-94e2-73bc8c1ee60e\") " pod="openshift-marketplace/community-operators-rxk7p" Dec 05 23:25:30 crc kubenswrapper[4734]: I1205 23:25:30.740441 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d3776cd-4682-4c8b-94e2-73bc8c1ee60e-utilities\") pod \"community-operators-rxk7p\" (UID: \"1d3776cd-4682-4c8b-94e2-73bc8c1ee60e\") " pod="openshift-marketplace/community-operators-rxk7p" Dec 05 23:25:30 crc kubenswrapper[4734]: I1205 23:25:30.740469 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d3776cd-4682-4c8b-94e2-73bc8c1ee60e-catalog-content\") pod \"community-operators-rxk7p\" (UID: \"1d3776cd-4682-4c8b-94e2-73bc8c1ee60e\") " pod="openshift-marketplace/community-operators-rxk7p" Dec 05 23:25:30 crc kubenswrapper[4734]: I1205 23:25:30.740911 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d3776cd-4682-4c8b-94e2-73bc8c1ee60e-utilities\") pod \"community-operators-rxk7p\" (UID: \"1d3776cd-4682-4c8b-94e2-73bc8c1ee60e\") " 
pod="openshift-marketplace/community-operators-rxk7p" Dec 05 23:25:30 crc kubenswrapper[4734]: I1205 23:25:30.741230 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d3776cd-4682-4c8b-94e2-73bc8c1ee60e-catalog-content\") pod \"community-operators-rxk7p\" (UID: \"1d3776cd-4682-4c8b-94e2-73bc8c1ee60e\") " pod="openshift-marketplace/community-operators-rxk7p" Dec 05 23:25:30 crc kubenswrapper[4734]: I1205 23:25:30.776141 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw5sv\" (UniqueName: \"kubernetes.io/projected/1d3776cd-4682-4c8b-94e2-73bc8c1ee60e-kube-api-access-lw5sv\") pod \"community-operators-rxk7p\" (UID: \"1d3776cd-4682-4c8b-94e2-73bc8c1ee60e\") " pod="openshift-marketplace/community-operators-rxk7p" Dec 05 23:25:30 crc kubenswrapper[4734]: I1205 23:25:30.785759 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rxk7p" Dec 05 23:25:30 crc kubenswrapper[4734]: I1205 23:25:30.842885 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52d0416a-4b26-4c76-8296-f279ad8c4158-catalog-content\") pod \"redhat-marketplace-fxgbf\" (UID: \"52d0416a-4b26-4c76-8296-f279ad8c4158\") " pod="openshift-marketplace/redhat-marketplace-fxgbf" Dec 05 23:25:30 crc kubenswrapper[4734]: I1205 23:25:30.842988 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gh7v\" (UniqueName: \"kubernetes.io/projected/52d0416a-4b26-4c76-8296-f279ad8c4158-kube-api-access-4gh7v\") pod \"redhat-marketplace-fxgbf\" (UID: \"52d0416a-4b26-4c76-8296-f279ad8c4158\") " pod="openshift-marketplace/redhat-marketplace-fxgbf" Dec 05 23:25:30 crc kubenswrapper[4734]: I1205 23:25:30.843028 4734 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52d0416a-4b26-4c76-8296-f279ad8c4158-utilities\") pod \"redhat-marketplace-fxgbf\" (UID: \"52d0416a-4b26-4c76-8296-f279ad8c4158\") " pod="openshift-marketplace/redhat-marketplace-fxgbf" Dec 05 23:25:30 crc kubenswrapper[4734]: I1205 23:25:30.944257 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52d0416a-4b26-4c76-8296-f279ad8c4158-catalog-content\") pod \"redhat-marketplace-fxgbf\" (UID: \"52d0416a-4b26-4c76-8296-f279ad8c4158\") " pod="openshift-marketplace/redhat-marketplace-fxgbf" Dec 05 23:25:30 crc kubenswrapper[4734]: I1205 23:25:30.944372 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gh7v\" (UniqueName: \"kubernetes.io/projected/52d0416a-4b26-4c76-8296-f279ad8c4158-kube-api-access-4gh7v\") pod \"redhat-marketplace-fxgbf\" (UID: \"52d0416a-4b26-4c76-8296-f279ad8c4158\") " pod="openshift-marketplace/redhat-marketplace-fxgbf" Dec 05 23:25:30 crc kubenswrapper[4734]: I1205 23:25:30.944421 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52d0416a-4b26-4c76-8296-f279ad8c4158-utilities\") pod \"redhat-marketplace-fxgbf\" (UID: \"52d0416a-4b26-4c76-8296-f279ad8c4158\") " pod="openshift-marketplace/redhat-marketplace-fxgbf" Dec 05 23:25:30 crc kubenswrapper[4734]: I1205 23:25:30.945661 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52d0416a-4b26-4c76-8296-f279ad8c4158-catalog-content\") pod \"redhat-marketplace-fxgbf\" (UID: \"52d0416a-4b26-4c76-8296-f279ad8c4158\") " pod="openshift-marketplace/redhat-marketplace-fxgbf" Dec 05 23:25:30 crc kubenswrapper[4734]: I1205 23:25:30.946054 4734 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52d0416a-4b26-4c76-8296-f279ad8c4158-utilities\") pod \"redhat-marketplace-fxgbf\" (UID: \"52d0416a-4b26-4c76-8296-f279ad8c4158\") " pod="openshift-marketplace/redhat-marketplace-fxgbf" Dec 05 23:25:30 crc kubenswrapper[4734]: I1205 23:25:30.970111 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gh7v\" (UniqueName: \"kubernetes.io/projected/52d0416a-4b26-4c76-8296-f279ad8c4158-kube-api-access-4gh7v\") pod \"redhat-marketplace-fxgbf\" (UID: \"52d0416a-4b26-4c76-8296-f279ad8c4158\") " pod="openshift-marketplace/redhat-marketplace-fxgbf" Dec 05 23:25:30 crc kubenswrapper[4734]: I1205 23:25:30.982932 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fxgbf" Dec 05 23:25:31 crc kubenswrapper[4734]: I1205 23:25:31.209455 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rxk7p"] Dec 05 23:25:31 crc kubenswrapper[4734]: I1205 23:25:31.212913 4734 generic.go:334] "Generic (PLEG): container finished" podID="03642c63-70ad-48c4-9fa1-c0e8d2d0d067" containerID="6a8f3097022e447597a4b4015ac8e55a7aec40848a066fb6b6900f8d24a73b35" exitCode=0 Dec 05 23:25:31 crc kubenswrapper[4734]: I1205 23:25:31.213001 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zvcrv" event={"ID":"03642c63-70ad-48c4-9fa1-c0e8d2d0d067","Type":"ContainerDied","Data":"6a8f3097022e447597a4b4015ac8e55a7aec40848a066fb6b6900f8d24a73b35"} Dec 05 23:25:31 crc kubenswrapper[4734]: W1205 23:25:31.214031 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d3776cd_4682_4c8b_94e2_73bc8c1ee60e.slice/crio-8adbf4953e185bf3319bf2c742449a0ff8b7662dddbba412049c49ac1fa09e23 WatchSource:0}: Error finding container 
8adbf4953e185bf3319bf2c742449a0ff8b7662dddbba412049c49ac1fa09e23: Status 404 returned error can't find the container with id 8adbf4953e185bf3319bf2c742449a0ff8b7662dddbba412049c49ac1fa09e23 Dec 05 23:25:31 crc kubenswrapper[4734]: I1205 23:25:31.216858 4734 generic.go:334] "Generic (PLEG): container finished" podID="9e82acab-ae84-48a7-83bd-7f83a96e3f7f" containerID="ec81132d882b53d828f3ee9d3e9a1c20f4050e1ca12bb6d71b7b20c6c3c1c0ba" exitCode=0 Dec 05 23:25:31 crc kubenswrapper[4734]: I1205 23:25:31.216919 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bspnj" event={"ID":"9e82acab-ae84-48a7-83bd-7f83a96e3f7f","Type":"ContainerDied","Data":"ec81132d882b53d828f3ee9d3e9a1c20f4050e1ca12bb6d71b7b20c6c3c1c0ba"} Dec 05 23:25:31 crc kubenswrapper[4734]: I1205 23:25:31.443972 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fxgbf"] Dec 05 23:25:31 crc kubenswrapper[4734]: W1205 23:25:31.450741 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52d0416a_4b26_4c76_8296_f279ad8c4158.slice/crio-d0fd0a5e5723dfa244b8370a16fa0c1d51878f340fb331aacaf14a54c34ac37c WatchSource:0}: Error finding container d0fd0a5e5723dfa244b8370a16fa0c1d51878f340fb331aacaf14a54c34ac37c: Status 404 returned error can't find the container with id d0fd0a5e5723dfa244b8370a16fa0c1d51878f340fb331aacaf14a54c34ac37c Dec 05 23:25:32 crc kubenswrapper[4734]: I1205 23:25:32.225320 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bspnj" event={"ID":"9e82acab-ae84-48a7-83bd-7f83a96e3f7f","Type":"ContainerStarted","Data":"ab792d262b509f47e546ba97e45c133e73812d05361897badbb8031b080b0811"} Dec 05 23:25:32 crc kubenswrapper[4734]: I1205 23:25:32.227729 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxk7p" 
event={"ID":"1d3776cd-4682-4c8b-94e2-73bc8c1ee60e","Type":"ContainerDied","Data":"01b5095dc0492852be17ea26d1a45286a93a1be96fc1477c4de9c12f455cc063"} Dec 05 23:25:32 crc kubenswrapper[4734]: I1205 23:25:32.226262 4734 generic.go:334] "Generic (PLEG): container finished" podID="1d3776cd-4682-4c8b-94e2-73bc8c1ee60e" containerID="01b5095dc0492852be17ea26d1a45286a93a1be96fc1477c4de9c12f455cc063" exitCode=0 Dec 05 23:25:32 crc kubenswrapper[4734]: I1205 23:25:32.227825 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxk7p" event={"ID":"1d3776cd-4682-4c8b-94e2-73bc8c1ee60e","Type":"ContainerStarted","Data":"8adbf4953e185bf3319bf2c742449a0ff8b7662dddbba412049c49ac1fa09e23"} Dec 05 23:25:32 crc kubenswrapper[4734]: I1205 23:25:32.232949 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zvcrv" event={"ID":"03642c63-70ad-48c4-9fa1-c0e8d2d0d067","Type":"ContainerStarted","Data":"7da43b1c82262c4b2ae46aa07cce8f4eef201eeca2200ba466247a2d9a9fdbc8"} Dec 05 23:25:32 crc kubenswrapper[4734]: I1205 23:25:32.236728 4734 generic.go:334] "Generic (PLEG): container finished" podID="52d0416a-4b26-4c76-8296-f279ad8c4158" containerID="9b25329c1b5072a8a1f7e48e8a94620e24637bb3a3377e179bea8a596b12e8eb" exitCode=0 Dec 05 23:25:32 crc kubenswrapper[4734]: I1205 23:25:32.236769 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxgbf" event={"ID":"52d0416a-4b26-4c76-8296-f279ad8c4158","Type":"ContainerDied","Data":"9b25329c1b5072a8a1f7e48e8a94620e24637bb3a3377e179bea8a596b12e8eb"} Dec 05 23:25:32 crc kubenswrapper[4734]: I1205 23:25:32.236791 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxgbf" event={"ID":"52d0416a-4b26-4c76-8296-f279ad8c4158","Type":"ContainerStarted","Data":"d0fd0a5e5723dfa244b8370a16fa0c1d51878f340fb331aacaf14a54c34ac37c"} Dec 05 23:25:32 crc kubenswrapper[4734]: I1205 
23:25:32.318676 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zvcrv" podStartSLOduration=1.890808002 podStartE2EDuration="4.318653164s" podCreationTimestamp="2025-12-05 23:25:28 +0000 UTC" firstStartedPulling="2025-12-05 23:25:29.188761755 +0000 UTC m=+349.872166071" lastFinishedPulling="2025-12-05 23:25:31.616606957 +0000 UTC m=+352.300011233" observedRunningTime="2025-12-05 23:25:32.317807433 +0000 UTC m=+353.001211709" watchObservedRunningTime="2025-12-05 23:25:32.318653164 +0000 UTC m=+353.002057440" Dec 05 23:25:33 crc kubenswrapper[4734]: I1205 23:25:33.249366 4734 generic.go:334] "Generic (PLEG): container finished" podID="52d0416a-4b26-4c76-8296-f279ad8c4158" containerID="849d9cbcc174f58297e5fa28241f548b0a6b05c57db85a7499929b07852f8ad8" exitCode=0 Dec 05 23:25:33 crc kubenswrapper[4734]: I1205 23:25:33.250205 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxgbf" event={"ID":"52d0416a-4b26-4c76-8296-f279ad8c4158","Type":"ContainerDied","Data":"849d9cbcc174f58297e5fa28241f548b0a6b05c57db85a7499929b07852f8ad8"} Dec 05 23:25:33 crc kubenswrapper[4734]: I1205 23:25:33.252899 4734 generic.go:334] "Generic (PLEG): container finished" podID="9e82acab-ae84-48a7-83bd-7f83a96e3f7f" containerID="ab792d262b509f47e546ba97e45c133e73812d05361897badbb8031b080b0811" exitCode=0 Dec 05 23:25:33 crc kubenswrapper[4734]: I1205 23:25:33.252996 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bspnj" event={"ID":"9e82acab-ae84-48a7-83bd-7f83a96e3f7f","Type":"ContainerDied","Data":"ab792d262b509f47e546ba97e45c133e73812d05361897badbb8031b080b0811"} Dec 05 23:25:33 crc kubenswrapper[4734]: I1205 23:25:33.255891 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxk7p" 
event={"ID":"1d3776cd-4682-4c8b-94e2-73bc8c1ee60e","Type":"ContainerStarted","Data":"7d4feb78246841d45efb73e28eea90dea965be8f65a70e7116e8e097c6f0cb44"} Dec 05 23:25:34 crc kubenswrapper[4734]: I1205 23:25:34.265185 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bspnj" event={"ID":"9e82acab-ae84-48a7-83bd-7f83a96e3f7f","Type":"ContainerStarted","Data":"52aef0643e03519f96916a6ec0705acbda8813cbd1d7f24370082626269e1f0a"} Dec 05 23:25:34 crc kubenswrapper[4734]: I1205 23:25:34.267344 4734 generic.go:334] "Generic (PLEG): container finished" podID="1d3776cd-4682-4c8b-94e2-73bc8c1ee60e" containerID="7d4feb78246841d45efb73e28eea90dea965be8f65a70e7116e8e097c6f0cb44" exitCode=0 Dec 05 23:25:34 crc kubenswrapper[4734]: I1205 23:25:34.267846 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxk7p" event={"ID":"1d3776cd-4682-4c8b-94e2-73bc8c1ee60e","Type":"ContainerDied","Data":"7d4feb78246841d45efb73e28eea90dea965be8f65a70e7116e8e097c6f0cb44"} Dec 05 23:25:34 crc kubenswrapper[4734]: I1205 23:25:34.271914 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxgbf" event={"ID":"52d0416a-4b26-4c76-8296-f279ad8c4158","Type":"ContainerStarted","Data":"4744e8a0e620be5cf5ece63724fca8f3cf22208dc4beb38b6d41b81aca9bf3be"} Dec 05 23:25:34 crc kubenswrapper[4734]: I1205 23:25:34.324619 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bspnj" podStartSLOduration=3.834762394 podStartE2EDuration="6.32459414s" podCreationTimestamp="2025-12-05 23:25:28 +0000 UTC" firstStartedPulling="2025-12-05 23:25:31.21847088 +0000 UTC m=+351.901875146" lastFinishedPulling="2025-12-05 23:25:33.708302596 +0000 UTC m=+354.391706892" observedRunningTime="2025-12-05 23:25:34.29606164 +0000 UTC m=+354.979465956" watchObservedRunningTime="2025-12-05 23:25:34.32459414 +0000 UTC m=+355.007998416" 
Dec 05 23:25:34 crc kubenswrapper[4734]: I1205 23:25:34.326250 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fxgbf" podStartSLOduration=2.9051527569999998 podStartE2EDuration="4.326242971s" podCreationTimestamp="2025-12-05 23:25:30 +0000 UTC" firstStartedPulling="2025-12-05 23:25:32.238093312 +0000 UTC m=+352.921497588" lastFinishedPulling="2025-12-05 23:25:33.659183516 +0000 UTC m=+354.342587802" observedRunningTime="2025-12-05 23:25:34.322927478 +0000 UTC m=+355.006331814" watchObservedRunningTime="2025-12-05 23:25:34.326242971 +0000 UTC m=+355.009647247" Dec 05 23:25:35 crc kubenswrapper[4734]: I1205 23:25:35.281623 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxk7p" event={"ID":"1d3776cd-4682-4c8b-94e2-73bc8c1ee60e","Type":"ContainerStarted","Data":"a8df235c4a51057d287effc2bb4acd8f235beb89378e1253159f3e940b507da9"} Dec 05 23:25:35 crc kubenswrapper[4734]: I1205 23:25:35.300022 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rxk7p" podStartSLOduration=2.863957128 podStartE2EDuration="5.300001717s" podCreationTimestamp="2025-12-05 23:25:30 +0000 UTC" firstStartedPulling="2025-12-05 23:25:32.228665894 +0000 UTC m=+352.912070170" lastFinishedPulling="2025-12-05 23:25:34.664710473 +0000 UTC m=+355.348114759" observedRunningTime="2025-12-05 23:25:35.299146595 +0000 UTC m=+355.982550871" watchObservedRunningTime="2025-12-05 23:25:35.300001717 +0000 UTC m=+355.983405993" Dec 05 23:25:38 crc kubenswrapper[4734]: I1205 23:25:38.404974 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zvcrv" Dec 05 23:25:38 crc kubenswrapper[4734]: I1205 23:25:38.405548 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zvcrv" Dec 05 23:25:38 crc 
kubenswrapper[4734]: I1205 23:25:38.450409 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zvcrv" Dec 05 23:25:39 crc kubenswrapper[4734]: I1205 23:25:39.350998 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zvcrv" Dec 05 23:25:39 crc kubenswrapper[4734]: I1205 23:25:39.590637 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bspnj" Dec 05 23:25:39 crc kubenswrapper[4734]: I1205 23:25:39.590761 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bspnj" Dec 05 23:25:39 crc kubenswrapper[4734]: I1205 23:25:39.640429 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bspnj" Dec 05 23:25:40 crc kubenswrapper[4734]: I1205 23:25:40.363929 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bspnj" Dec 05 23:25:40 crc kubenswrapper[4734]: I1205 23:25:40.786797 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rxk7p" Dec 05 23:25:40 crc kubenswrapper[4734]: I1205 23:25:40.788161 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rxk7p" Dec 05 23:25:40 crc kubenswrapper[4734]: I1205 23:25:40.844559 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rxk7p" Dec 05 23:25:40 crc kubenswrapper[4734]: I1205 23:25:40.983989 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fxgbf" Dec 05 23:25:40 crc kubenswrapper[4734]: I1205 23:25:40.984054 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-marketplace-fxgbf" Dec 05 23:25:41 crc kubenswrapper[4734]: I1205 23:25:41.035209 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fxgbf" Dec 05 23:25:41 crc kubenswrapper[4734]: I1205 23:25:41.357185 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fxgbf" Dec 05 23:25:41 crc kubenswrapper[4734]: I1205 23:25:41.376622 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rxk7p" Dec 05 23:25:46 crc kubenswrapper[4734]: I1205 23:25:46.406289 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dtrsz"] Dec 05 23:25:46 crc kubenswrapper[4734]: I1205 23:25:46.407621 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-dtrsz" Dec 05 23:25:46 crc kubenswrapper[4734]: I1205 23:25:46.430305 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dtrsz"] Dec 05 23:25:46 crc kubenswrapper[4734]: I1205 23:25:46.508007 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fb82da8a-4fbf-4535-b975-e56a3615e08c-registry-tls\") pod \"image-registry-66df7c8f76-dtrsz\" (UID: \"fb82da8a-4fbf-4535-b975-e56a3615e08c\") " pod="openshift-image-registry/image-registry-66df7c8f76-dtrsz" Dec 05 23:25:46 crc kubenswrapper[4734]: I1205 23:25:46.508131 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fb82da8a-4fbf-4535-b975-e56a3615e08c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dtrsz\" (UID: 
\"fb82da8a-4fbf-4535-b975-e56a3615e08c\") " pod="openshift-image-registry/image-registry-66df7c8f76-dtrsz" Dec 05 23:25:46 crc kubenswrapper[4734]: I1205 23:25:46.508226 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-dtrsz\" (UID: \"fb82da8a-4fbf-4535-b975-e56a3615e08c\") " pod="openshift-image-registry/image-registry-66df7c8f76-dtrsz" Dec 05 23:25:46 crc kubenswrapper[4734]: I1205 23:25:46.508271 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb82da8a-4fbf-4535-b975-e56a3615e08c-trusted-ca\") pod \"image-registry-66df7c8f76-dtrsz\" (UID: \"fb82da8a-4fbf-4535-b975-e56a3615e08c\") " pod="openshift-image-registry/image-registry-66df7c8f76-dtrsz" Dec 05 23:25:46 crc kubenswrapper[4734]: I1205 23:25:46.508297 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k544x\" (UniqueName: \"kubernetes.io/projected/fb82da8a-4fbf-4535-b975-e56a3615e08c-kube-api-access-k544x\") pod \"image-registry-66df7c8f76-dtrsz\" (UID: \"fb82da8a-4fbf-4535-b975-e56a3615e08c\") " pod="openshift-image-registry/image-registry-66df7c8f76-dtrsz" Dec 05 23:25:46 crc kubenswrapper[4734]: I1205 23:25:46.508346 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fb82da8a-4fbf-4535-b975-e56a3615e08c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dtrsz\" (UID: \"fb82da8a-4fbf-4535-b975-e56a3615e08c\") " pod="openshift-image-registry/image-registry-66df7c8f76-dtrsz" Dec 05 23:25:46 crc kubenswrapper[4734]: I1205 23:25:46.508378 4734 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fb82da8a-4fbf-4535-b975-e56a3615e08c-bound-sa-token\") pod \"image-registry-66df7c8f76-dtrsz\" (UID: \"fb82da8a-4fbf-4535-b975-e56a3615e08c\") " pod="openshift-image-registry/image-registry-66df7c8f76-dtrsz" Dec 05 23:25:46 crc kubenswrapper[4734]: I1205 23:25:46.508717 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fb82da8a-4fbf-4535-b975-e56a3615e08c-registry-certificates\") pod \"image-registry-66df7c8f76-dtrsz\" (UID: \"fb82da8a-4fbf-4535-b975-e56a3615e08c\") " pod="openshift-image-registry/image-registry-66df7c8f76-dtrsz" Dec 05 23:25:46 crc kubenswrapper[4734]: I1205 23:25:46.555212 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-dtrsz\" (UID: \"fb82da8a-4fbf-4535-b975-e56a3615e08c\") " pod="openshift-image-registry/image-registry-66df7c8f76-dtrsz" Dec 05 23:25:46 crc kubenswrapper[4734]: I1205 23:25:46.610258 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb82da8a-4fbf-4535-b975-e56a3615e08c-trusted-ca\") pod \"image-registry-66df7c8f76-dtrsz\" (UID: \"fb82da8a-4fbf-4535-b975-e56a3615e08c\") " pod="openshift-image-registry/image-registry-66df7c8f76-dtrsz" Dec 05 23:25:46 crc kubenswrapper[4734]: I1205 23:25:46.610668 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k544x\" (UniqueName: \"kubernetes.io/projected/fb82da8a-4fbf-4535-b975-e56a3615e08c-kube-api-access-k544x\") pod \"image-registry-66df7c8f76-dtrsz\" (UID: \"fb82da8a-4fbf-4535-b975-e56a3615e08c\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-dtrsz" Dec 05 23:25:46 crc kubenswrapper[4734]: I1205 23:25:46.610710 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fb82da8a-4fbf-4535-b975-e56a3615e08c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dtrsz\" (UID: \"fb82da8a-4fbf-4535-b975-e56a3615e08c\") " pod="openshift-image-registry/image-registry-66df7c8f76-dtrsz" Dec 05 23:25:46 crc kubenswrapper[4734]: I1205 23:25:46.610751 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fb82da8a-4fbf-4535-b975-e56a3615e08c-bound-sa-token\") pod \"image-registry-66df7c8f76-dtrsz\" (UID: \"fb82da8a-4fbf-4535-b975-e56a3615e08c\") " pod="openshift-image-registry/image-registry-66df7c8f76-dtrsz" Dec 05 23:25:46 crc kubenswrapper[4734]: I1205 23:25:46.610774 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fb82da8a-4fbf-4535-b975-e56a3615e08c-registry-certificates\") pod \"image-registry-66df7c8f76-dtrsz\" (UID: \"fb82da8a-4fbf-4535-b975-e56a3615e08c\") " pod="openshift-image-registry/image-registry-66df7c8f76-dtrsz" Dec 05 23:25:46 crc kubenswrapper[4734]: I1205 23:25:46.610813 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fb82da8a-4fbf-4535-b975-e56a3615e08c-registry-tls\") pod \"image-registry-66df7c8f76-dtrsz\" (UID: \"fb82da8a-4fbf-4535-b975-e56a3615e08c\") " pod="openshift-image-registry/image-registry-66df7c8f76-dtrsz" Dec 05 23:25:46 crc kubenswrapper[4734]: I1205 23:25:46.610839 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fb82da8a-4fbf-4535-b975-e56a3615e08c-installation-pull-secrets\") 
pod \"image-registry-66df7c8f76-dtrsz\" (UID: \"fb82da8a-4fbf-4535-b975-e56a3615e08c\") " pod="openshift-image-registry/image-registry-66df7c8f76-dtrsz" Dec 05 23:25:46 crc kubenswrapper[4734]: I1205 23:25:46.611465 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fb82da8a-4fbf-4535-b975-e56a3615e08c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dtrsz\" (UID: \"fb82da8a-4fbf-4535-b975-e56a3615e08c\") " pod="openshift-image-registry/image-registry-66df7c8f76-dtrsz" Dec 05 23:25:46 crc kubenswrapper[4734]: I1205 23:25:46.612701 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fb82da8a-4fbf-4535-b975-e56a3615e08c-registry-certificates\") pod \"image-registry-66df7c8f76-dtrsz\" (UID: \"fb82da8a-4fbf-4535-b975-e56a3615e08c\") " pod="openshift-image-registry/image-registry-66df7c8f76-dtrsz" Dec 05 23:25:46 crc kubenswrapper[4734]: I1205 23:25:46.612729 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb82da8a-4fbf-4535-b975-e56a3615e08c-trusted-ca\") pod \"image-registry-66df7c8f76-dtrsz\" (UID: \"fb82da8a-4fbf-4535-b975-e56a3615e08c\") " pod="openshift-image-registry/image-registry-66df7c8f76-dtrsz" Dec 05 23:25:46 crc kubenswrapper[4734]: I1205 23:25:46.620051 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fb82da8a-4fbf-4535-b975-e56a3615e08c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dtrsz\" (UID: \"fb82da8a-4fbf-4535-b975-e56a3615e08c\") " pod="openshift-image-registry/image-registry-66df7c8f76-dtrsz" Dec 05 23:25:46 crc kubenswrapper[4734]: I1205 23:25:46.622905 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/fb82da8a-4fbf-4535-b975-e56a3615e08c-registry-tls\") pod \"image-registry-66df7c8f76-dtrsz\" (UID: \"fb82da8a-4fbf-4535-b975-e56a3615e08c\") " pod="openshift-image-registry/image-registry-66df7c8f76-dtrsz" Dec 05 23:25:46 crc kubenswrapper[4734]: I1205 23:25:46.632216 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k544x\" (UniqueName: \"kubernetes.io/projected/fb82da8a-4fbf-4535-b975-e56a3615e08c-kube-api-access-k544x\") pod \"image-registry-66df7c8f76-dtrsz\" (UID: \"fb82da8a-4fbf-4535-b975-e56a3615e08c\") " pod="openshift-image-registry/image-registry-66df7c8f76-dtrsz" Dec 05 23:25:46 crc kubenswrapper[4734]: I1205 23:25:46.636407 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fb82da8a-4fbf-4535-b975-e56a3615e08c-bound-sa-token\") pod \"image-registry-66df7c8f76-dtrsz\" (UID: \"fb82da8a-4fbf-4535-b975-e56a3615e08c\") " pod="openshift-image-registry/image-registry-66df7c8f76-dtrsz" Dec 05 23:25:46 crc kubenswrapper[4734]: I1205 23:25:46.730085 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-dtrsz" Dec 05 23:25:47 crc kubenswrapper[4734]: I1205 23:25:47.218475 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dtrsz"] Dec 05 23:25:47 crc kubenswrapper[4734]: I1205 23:25:47.354119 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-dtrsz" event={"ID":"fb82da8a-4fbf-4535-b975-e56a3615e08c","Type":"ContainerStarted","Data":"0a7d750b036d40d93b94e6ba00f07c6c228c88b95a4d7a20e00df1b501bf79d6"} Dec 05 23:25:48 crc kubenswrapper[4734]: I1205 23:25:48.368313 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-dtrsz" event={"ID":"fb82da8a-4fbf-4535-b975-e56a3615e08c","Type":"ContainerStarted","Data":"021b44b27be112be0929833193452fc294af49514bf89d797f66894c7acc2161"} Dec 05 23:25:48 crc kubenswrapper[4734]: I1205 23:25:48.368760 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-dtrsz" Dec 05 23:25:50 crc kubenswrapper[4734]: I1205 23:25:50.444927 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:25:50 crc kubenswrapper[4734]: I1205 23:25:50.445472 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:26:06 crc kubenswrapper[4734]: I1205 23:26:06.736345 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-dtrsz" Dec 05 23:26:06 crc kubenswrapper[4734]: I1205 23:26:06.758961 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-dtrsz" podStartSLOduration=20.758933748 podStartE2EDuration="20.758933748s" podCreationTimestamp="2025-12-05 23:25:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:25:48.398376834 +0000 UTC m=+369.081781140" watchObservedRunningTime="2025-12-05 23:26:06.758933748 +0000 UTC m=+387.442338024" Dec 05 23:26:06 crc kubenswrapper[4734]: I1205 23:26:06.804259 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ptxqw"] Dec 05 23:26:20 crc kubenswrapper[4734]: I1205 23:26:20.445182 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:26:20 crc kubenswrapper[4734]: I1205 23:26:20.446113 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:26:20 crc kubenswrapper[4734]: I1205 23:26:20.446181 4734 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" Dec 05 23:26:20 crc kubenswrapper[4734]: I1205 23:26:20.446953 4734 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"d4346c20725cce5df929f1d9a537d5302866dcd17b21ee10d0662364730d69a9"} pod="openshift-machine-config-operator/machine-config-daemon-vn94d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 23:26:20 crc kubenswrapper[4734]: I1205 23:26:20.447060 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" containerID="cri-o://d4346c20725cce5df929f1d9a537d5302866dcd17b21ee10d0662364730d69a9" gracePeriod=600 Dec 05 23:26:22 crc kubenswrapper[4734]: I1205 23:26:22.574654 4734 generic.go:334] "Generic (PLEG): container finished" podID="65758270-a7a7-46b5-af95-0588daf9fa86" containerID="d4346c20725cce5df929f1d9a537d5302866dcd17b21ee10d0662364730d69a9" exitCode=0 Dec 05 23:26:22 crc kubenswrapper[4734]: I1205 23:26:22.574727 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerDied","Data":"d4346c20725cce5df929f1d9a537d5302866dcd17b21ee10d0662364730d69a9"} Dec 05 23:26:22 crc kubenswrapper[4734]: I1205 23:26:22.575397 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerStarted","Data":"22bacdafd40b9938599de212c005778a6f3d95d2f7f54005c1b60a6e84bd1a7b"} Dec 05 23:26:22 crc kubenswrapper[4734]: I1205 23:26:22.575442 4734 scope.go:117] "RemoveContainer" containerID="2c0098a95c28de2d528d5dacf74969042d17d545bc6ee66496c46da61324ec18" Dec 05 23:26:31 crc kubenswrapper[4734]: I1205 23:26:31.898335 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" 
podUID="f4f948a0-bcd5-4e9e-86ec-0429082dac44" containerName="registry" containerID="cri-o://e38a8d45db59594dbd6149cd98567240633a7e01642234d90197625ff6c83768" gracePeriod=30 Dec 05 23:26:32 crc kubenswrapper[4734]: I1205 23:26:32.282078 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:26:32 crc kubenswrapper[4734]: I1205 23:26:32.468791 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f4f948a0-bcd5-4e9e-86ec-0429082dac44-installation-pull-secrets\") pod \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " Dec 05 23:26:32 crc kubenswrapper[4734]: I1205 23:26:32.469835 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f4f948a0-bcd5-4e9e-86ec-0429082dac44-ca-trust-extracted\") pod \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " Dec 05 23:26:32 crc kubenswrapper[4734]: I1205 23:26:32.470124 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " Dec 05 23:26:32 crc kubenswrapper[4734]: I1205 23:26:32.470215 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f4f948a0-bcd5-4e9e-86ec-0429082dac44-bound-sa-token\") pod \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " Dec 05 23:26:32 crc kubenswrapper[4734]: I1205 23:26:32.470312 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-mgtxb\" (UniqueName: \"kubernetes.io/projected/f4f948a0-bcd5-4e9e-86ec-0429082dac44-kube-api-access-mgtxb\") pod \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " Dec 05 23:26:32 crc kubenswrapper[4734]: I1205 23:26:32.470359 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4f948a0-bcd5-4e9e-86ec-0429082dac44-trusted-ca\") pod \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " Dec 05 23:26:32 crc kubenswrapper[4734]: I1205 23:26:32.470419 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f4f948a0-bcd5-4e9e-86ec-0429082dac44-registry-certificates\") pod \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " Dec 05 23:26:32 crc kubenswrapper[4734]: I1205 23:26:32.470459 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f4f948a0-bcd5-4e9e-86ec-0429082dac44-registry-tls\") pod \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\" (UID: \"f4f948a0-bcd5-4e9e-86ec-0429082dac44\") " Dec 05 23:26:32 crc kubenswrapper[4734]: I1205 23:26:32.471271 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4f948a0-bcd5-4e9e-86ec-0429082dac44-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f4f948a0-bcd5-4e9e-86ec-0429082dac44" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:26:32 crc kubenswrapper[4734]: I1205 23:26:32.471437 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4f948a0-bcd5-4e9e-86ec-0429082dac44-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f4f948a0-bcd5-4e9e-86ec-0429082dac44" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:26:32 crc kubenswrapper[4734]: I1205 23:26:32.475663 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4f948a0-bcd5-4e9e-86ec-0429082dac44-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f4f948a0-bcd5-4e9e-86ec-0429082dac44" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:26:32 crc kubenswrapper[4734]: I1205 23:26:32.475939 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4f948a0-bcd5-4e9e-86ec-0429082dac44-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f4f948a0-bcd5-4e9e-86ec-0429082dac44" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:26:32 crc kubenswrapper[4734]: I1205 23:26:32.480248 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4f948a0-bcd5-4e9e-86ec-0429082dac44-kube-api-access-mgtxb" (OuterVolumeSpecName: "kube-api-access-mgtxb") pod "f4f948a0-bcd5-4e9e-86ec-0429082dac44" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44"). InnerVolumeSpecName "kube-api-access-mgtxb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:26:32 crc kubenswrapper[4734]: I1205 23:26:32.482399 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "f4f948a0-bcd5-4e9e-86ec-0429082dac44" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 05 23:26:32 crc kubenswrapper[4734]: I1205 23:26:32.484197 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4f948a0-bcd5-4e9e-86ec-0429082dac44-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f4f948a0-bcd5-4e9e-86ec-0429082dac44" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:26:32 crc kubenswrapper[4734]: I1205 23:26:32.490491 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4f948a0-bcd5-4e9e-86ec-0429082dac44-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f4f948a0-bcd5-4e9e-86ec-0429082dac44" (UID: "f4f948a0-bcd5-4e9e-86ec-0429082dac44"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:26:32 crc kubenswrapper[4734]: I1205 23:26:32.572724 4734 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f4f948a0-bcd5-4e9e-86ec-0429082dac44-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 23:26:32 crc kubenswrapper[4734]: I1205 23:26:32.572824 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgtxb\" (UniqueName: \"kubernetes.io/projected/f4f948a0-bcd5-4e9e-86ec-0429082dac44-kube-api-access-mgtxb\") on node \"crc\" DevicePath \"\"" Dec 05 23:26:32 crc kubenswrapper[4734]: I1205 23:26:32.572841 4734 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4f948a0-bcd5-4e9e-86ec-0429082dac44-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 23:26:32 crc kubenswrapper[4734]: I1205 23:26:32.572852 4734 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f4f948a0-bcd5-4e9e-86ec-0429082dac44-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 05 23:26:32 crc kubenswrapper[4734]: I1205 23:26:32.572862 4734 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f4f948a0-bcd5-4e9e-86ec-0429082dac44-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 05 23:26:32 crc kubenswrapper[4734]: I1205 23:26:32.572874 4734 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f4f948a0-bcd5-4e9e-86ec-0429082dac44-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 05 23:26:32 crc kubenswrapper[4734]: I1205 23:26:32.572883 4734 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f4f948a0-bcd5-4e9e-86ec-0429082dac44-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 05 23:26:32 crc 
kubenswrapper[4734]: I1205 23:26:32.648153 4734 generic.go:334] "Generic (PLEG): container finished" podID="f4f948a0-bcd5-4e9e-86ec-0429082dac44" containerID="e38a8d45db59594dbd6149cd98567240633a7e01642234d90197625ff6c83768" exitCode=0 Dec 05 23:26:32 crc kubenswrapper[4734]: I1205 23:26:32.648237 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" event={"ID":"f4f948a0-bcd5-4e9e-86ec-0429082dac44","Type":"ContainerDied","Data":"e38a8d45db59594dbd6149cd98567240633a7e01642234d90197625ff6c83768"} Dec 05 23:26:32 crc kubenswrapper[4734]: I1205 23:26:32.648279 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" event={"ID":"f4f948a0-bcd5-4e9e-86ec-0429082dac44","Type":"ContainerDied","Data":"f734123a9f2008a70e1b97f4c4481b3d5016d91517a8e6596f89a1cd984555aa"} Dec 05 23:26:32 crc kubenswrapper[4734]: I1205 23:26:32.648303 4734 scope.go:117] "RemoveContainer" containerID="e38a8d45db59594dbd6149cd98567240633a7e01642234d90197625ff6c83768" Dec 05 23:26:32 crc kubenswrapper[4734]: I1205 23:26:32.648462 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ptxqw" Dec 05 23:26:32 crc kubenswrapper[4734]: I1205 23:26:32.675655 4734 scope.go:117] "RemoveContainer" containerID="e38a8d45db59594dbd6149cd98567240633a7e01642234d90197625ff6c83768" Dec 05 23:26:32 crc kubenswrapper[4734]: E1205 23:26:32.676249 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e38a8d45db59594dbd6149cd98567240633a7e01642234d90197625ff6c83768\": container with ID starting with e38a8d45db59594dbd6149cd98567240633a7e01642234d90197625ff6c83768 not found: ID does not exist" containerID="e38a8d45db59594dbd6149cd98567240633a7e01642234d90197625ff6c83768" Dec 05 23:26:32 crc kubenswrapper[4734]: I1205 23:26:32.676290 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e38a8d45db59594dbd6149cd98567240633a7e01642234d90197625ff6c83768"} err="failed to get container status \"e38a8d45db59594dbd6149cd98567240633a7e01642234d90197625ff6c83768\": rpc error: code = NotFound desc = could not find container \"e38a8d45db59594dbd6149cd98567240633a7e01642234d90197625ff6c83768\": container with ID starting with e38a8d45db59594dbd6149cd98567240633a7e01642234d90197625ff6c83768 not found: ID does not exist" Dec 05 23:26:32 crc kubenswrapper[4734]: I1205 23:26:32.692786 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ptxqw"] Dec 05 23:26:32 crc kubenswrapper[4734]: I1205 23:26:32.697517 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ptxqw"] Dec 05 23:26:33 crc kubenswrapper[4734]: I1205 23:26:33.629520 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4f948a0-bcd5-4e9e-86ec-0429082dac44" path="/var/lib/kubelet/pods/f4f948a0-bcd5-4e9e-86ec-0429082dac44/volumes" Dec 05 23:28:50 crc kubenswrapper[4734]: I1205 
23:28:50.445620 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:28:50 crc kubenswrapper[4734]: I1205 23:28:50.446286 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:29:20 crc kubenswrapper[4734]: I1205 23:29:20.445249 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:29:20 crc kubenswrapper[4734]: I1205 23:29:20.446128 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:29:50 crc kubenswrapper[4734]: I1205 23:29:50.445281 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:29:50 crc kubenswrapper[4734]: I1205 23:29:50.446064 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" 
podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:29:50 crc kubenswrapper[4734]: I1205 23:29:50.446136 4734 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" Dec 05 23:29:50 crc kubenswrapper[4734]: I1205 23:29:50.447079 4734 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"22bacdafd40b9938599de212c005778a6f3d95d2f7f54005c1b60a6e84bd1a7b"} pod="openshift-machine-config-operator/machine-config-daemon-vn94d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 23:29:50 crc kubenswrapper[4734]: I1205 23:29:50.447185 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" containerID="cri-o://22bacdafd40b9938599de212c005778a6f3d95d2f7f54005c1b60a6e84bd1a7b" gracePeriod=600 Dec 05 23:29:51 crc kubenswrapper[4734]: I1205 23:29:51.051556 4734 generic.go:334] "Generic (PLEG): container finished" podID="65758270-a7a7-46b5-af95-0588daf9fa86" containerID="22bacdafd40b9938599de212c005778a6f3d95d2f7f54005c1b60a6e84bd1a7b" exitCode=0 Dec 05 23:29:51 crc kubenswrapper[4734]: I1205 23:29:51.051627 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerDied","Data":"22bacdafd40b9938599de212c005778a6f3d95d2f7f54005c1b60a6e84bd1a7b"} Dec 05 23:29:51 crc kubenswrapper[4734]: I1205 23:29:51.051907 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-vn94d" event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerStarted","Data":"8e69331b125e1151d942b08cb111e9d9d1598a8f70aacd7d59fba49b1cd48af6"} Dec 05 23:29:51 crc kubenswrapper[4734]: I1205 23:29:51.051937 4734 scope.go:117] "RemoveContainer" containerID="d4346c20725cce5df929f1d9a537d5302866dcd17b21ee10d0662364730d69a9" Dec 05 23:30:00 crc kubenswrapper[4734]: I1205 23:30:00.197308 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416290-pwzdw"] Dec 05 23:30:00 crc kubenswrapper[4734]: E1205 23:30:00.201972 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f948a0-bcd5-4e9e-86ec-0429082dac44" containerName="registry" Dec 05 23:30:00 crc kubenswrapper[4734]: I1205 23:30:00.202004 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f948a0-bcd5-4e9e-86ec-0429082dac44" containerName="registry" Dec 05 23:30:00 crc kubenswrapper[4734]: I1205 23:30:00.202234 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4f948a0-bcd5-4e9e-86ec-0429082dac44" containerName="registry" Dec 05 23:30:00 crc kubenswrapper[4734]: I1205 23:30:00.203014 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-pwzdw" Dec 05 23:30:00 crc kubenswrapper[4734]: I1205 23:30:00.206132 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 23:30:00 crc kubenswrapper[4734]: I1205 23:30:00.206233 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 23:30:00 crc kubenswrapper[4734]: I1205 23:30:00.206610 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54smz\" (UniqueName: \"kubernetes.io/projected/267d14c5-5f5d-424b-8e0c-a7f1bc88892a-kube-api-access-54smz\") pod \"collect-profiles-29416290-pwzdw\" (UID: \"267d14c5-5f5d-424b-8e0c-a7f1bc88892a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-pwzdw" Dec 05 23:30:00 crc kubenswrapper[4734]: I1205 23:30:00.206664 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/267d14c5-5f5d-424b-8e0c-a7f1bc88892a-secret-volume\") pod \"collect-profiles-29416290-pwzdw\" (UID: \"267d14c5-5f5d-424b-8e0c-a7f1bc88892a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-pwzdw" Dec 05 23:30:00 crc kubenswrapper[4734]: I1205 23:30:00.207182 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416290-pwzdw"] Dec 05 23:30:00 crc kubenswrapper[4734]: I1205 23:30:00.207691 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/267d14c5-5f5d-424b-8e0c-a7f1bc88892a-config-volume\") pod \"collect-profiles-29416290-pwzdw\" (UID: \"267d14c5-5f5d-424b-8e0c-a7f1bc88892a\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-pwzdw" Dec 05 23:30:00 crc kubenswrapper[4734]: I1205 23:30:00.308037 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/267d14c5-5f5d-424b-8e0c-a7f1bc88892a-config-volume\") pod \"collect-profiles-29416290-pwzdw\" (UID: \"267d14c5-5f5d-424b-8e0c-a7f1bc88892a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-pwzdw" Dec 05 23:30:00 crc kubenswrapper[4734]: I1205 23:30:00.308464 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54smz\" (UniqueName: \"kubernetes.io/projected/267d14c5-5f5d-424b-8e0c-a7f1bc88892a-kube-api-access-54smz\") pod \"collect-profiles-29416290-pwzdw\" (UID: \"267d14c5-5f5d-424b-8e0c-a7f1bc88892a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-pwzdw" Dec 05 23:30:00 crc kubenswrapper[4734]: I1205 23:30:00.308503 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/267d14c5-5f5d-424b-8e0c-a7f1bc88892a-secret-volume\") pod \"collect-profiles-29416290-pwzdw\" (UID: \"267d14c5-5f5d-424b-8e0c-a7f1bc88892a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-pwzdw" Dec 05 23:30:00 crc kubenswrapper[4734]: I1205 23:30:00.310970 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/267d14c5-5f5d-424b-8e0c-a7f1bc88892a-config-volume\") pod \"collect-profiles-29416290-pwzdw\" (UID: \"267d14c5-5f5d-424b-8e0c-a7f1bc88892a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-pwzdw" Dec 05 23:30:00 crc kubenswrapper[4734]: I1205 23:30:00.315998 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/267d14c5-5f5d-424b-8e0c-a7f1bc88892a-secret-volume\") pod \"collect-profiles-29416290-pwzdw\" (UID: \"267d14c5-5f5d-424b-8e0c-a7f1bc88892a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-pwzdw" Dec 05 23:30:00 crc kubenswrapper[4734]: I1205 23:30:00.326337 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54smz\" (UniqueName: \"kubernetes.io/projected/267d14c5-5f5d-424b-8e0c-a7f1bc88892a-kube-api-access-54smz\") pod \"collect-profiles-29416290-pwzdw\" (UID: \"267d14c5-5f5d-424b-8e0c-a7f1bc88892a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-pwzdw" Dec 05 23:30:00 crc kubenswrapper[4734]: I1205 23:30:00.532107 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-pwzdw" Dec 05 23:30:00 crc kubenswrapper[4734]: I1205 23:30:00.988881 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416290-pwzdw"] Dec 05 23:30:01 crc kubenswrapper[4734]: I1205 23:30:01.137782 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-pwzdw" event={"ID":"267d14c5-5f5d-424b-8e0c-a7f1bc88892a","Type":"ContainerStarted","Data":"22ab8a3516c0f76f56c2665776e8062aac5ee91ace07ec854e33671f4354f137"} Dec 05 23:30:01 crc kubenswrapper[4734]: I1205 23:30:01.137866 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-pwzdw" event={"ID":"267d14c5-5f5d-424b-8e0c-a7f1bc88892a","Type":"ContainerStarted","Data":"b27055a4cbab88abda65b202d33895110f017cf787b2de28c547d9a0d15ed38c"} Dec 05 23:30:01 crc kubenswrapper[4734]: I1205 23:30:01.160551 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-pwzdw" 
podStartSLOduration=1.160511562 podStartE2EDuration="1.160511562s" podCreationTimestamp="2025-12-05 23:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:30:01.158412922 +0000 UTC m=+621.841817208" watchObservedRunningTime="2025-12-05 23:30:01.160511562 +0000 UTC m=+621.843915848" Dec 05 23:30:02 crc kubenswrapper[4734]: I1205 23:30:02.146942 4734 generic.go:334] "Generic (PLEG): container finished" podID="267d14c5-5f5d-424b-8e0c-a7f1bc88892a" containerID="22ab8a3516c0f76f56c2665776e8062aac5ee91ace07ec854e33671f4354f137" exitCode=0 Dec 05 23:30:02 crc kubenswrapper[4734]: I1205 23:30:02.147013 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-pwzdw" event={"ID":"267d14c5-5f5d-424b-8e0c-a7f1bc88892a","Type":"ContainerDied","Data":"22ab8a3516c0f76f56c2665776e8062aac5ee91ace07ec854e33671f4354f137"} Dec 05 23:30:03 crc kubenswrapper[4734]: I1205 23:30:03.386063 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-pwzdw" Dec 05 23:30:03 crc kubenswrapper[4734]: I1205 23:30:03.556650 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/267d14c5-5f5d-424b-8e0c-a7f1bc88892a-config-volume\") pod \"267d14c5-5f5d-424b-8e0c-a7f1bc88892a\" (UID: \"267d14c5-5f5d-424b-8e0c-a7f1bc88892a\") " Dec 05 23:30:03 crc kubenswrapper[4734]: I1205 23:30:03.556800 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54smz\" (UniqueName: \"kubernetes.io/projected/267d14c5-5f5d-424b-8e0c-a7f1bc88892a-kube-api-access-54smz\") pod \"267d14c5-5f5d-424b-8e0c-a7f1bc88892a\" (UID: \"267d14c5-5f5d-424b-8e0c-a7f1bc88892a\") " Dec 05 23:30:03 crc kubenswrapper[4734]: I1205 23:30:03.556850 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/267d14c5-5f5d-424b-8e0c-a7f1bc88892a-secret-volume\") pod \"267d14c5-5f5d-424b-8e0c-a7f1bc88892a\" (UID: \"267d14c5-5f5d-424b-8e0c-a7f1bc88892a\") " Dec 05 23:30:03 crc kubenswrapper[4734]: I1205 23:30:03.558056 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/267d14c5-5f5d-424b-8e0c-a7f1bc88892a-config-volume" (OuterVolumeSpecName: "config-volume") pod "267d14c5-5f5d-424b-8e0c-a7f1bc88892a" (UID: "267d14c5-5f5d-424b-8e0c-a7f1bc88892a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:30:03 crc kubenswrapper[4734]: I1205 23:30:03.567231 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/267d14c5-5f5d-424b-8e0c-a7f1bc88892a-kube-api-access-54smz" (OuterVolumeSpecName: "kube-api-access-54smz") pod "267d14c5-5f5d-424b-8e0c-a7f1bc88892a" (UID: "267d14c5-5f5d-424b-8e0c-a7f1bc88892a"). 
InnerVolumeSpecName "kube-api-access-54smz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:30:03 crc kubenswrapper[4734]: I1205 23:30:03.567342 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/267d14c5-5f5d-424b-8e0c-a7f1bc88892a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "267d14c5-5f5d-424b-8e0c-a7f1bc88892a" (UID: "267d14c5-5f5d-424b-8e0c-a7f1bc88892a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:30:03 crc kubenswrapper[4734]: I1205 23:30:03.659748 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54smz\" (UniqueName: \"kubernetes.io/projected/267d14c5-5f5d-424b-8e0c-a7f1bc88892a-kube-api-access-54smz\") on node \"crc\" DevicePath \"\"" Dec 05 23:30:03 crc kubenswrapper[4734]: I1205 23:30:03.659989 4734 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/267d14c5-5f5d-424b-8e0c-a7f1bc88892a-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 23:30:03 crc kubenswrapper[4734]: I1205 23:30:03.660046 4734 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/267d14c5-5f5d-424b-8e0c-a7f1bc88892a-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 23:30:04 crc kubenswrapper[4734]: I1205 23:30:04.166733 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-pwzdw" Dec 05 23:30:04 crc kubenswrapper[4734]: I1205 23:30:04.166491 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416290-pwzdw" event={"ID":"267d14c5-5f5d-424b-8e0c-a7f1bc88892a","Type":"ContainerDied","Data":"b27055a4cbab88abda65b202d33895110f017cf787b2de28c547d9a0d15ed38c"} Dec 05 23:30:04 crc kubenswrapper[4734]: I1205 23:30:04.168895 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b27055a4cbab88abda65b202d33895110f017cf787b2de28c547d9a0d15ed38c" Dec 05 23:30:24 crc kubenswrapper[4734]: I1205 23:30:24.838513 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-j92m9"] Dec 05 23:30:24 crc kubenswrapper[4734]: E1205 23:30:24.839637 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="267d14c5-5f5d-424b-8e0c-a7f1bc88892a" containerName="collect-profiles" Dec 05 23:30:24 crc kubenswrapper[4734]: I1205 23:30:24.839655 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="267d14c5-5f5d-424b-8e0c-a7f1bc88892a" containerName="collect-profiles" Dec 05 23:30:24 crc kubenswrapper[4734]: I1205 23:30:24.839790 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="267d14c5-5f5d-424b-8e0c-a7f1bc88892a" containerName="collect-profiles" Dec 05 23:30:24 crc kubenswrapper[4734]: I1205 23:30:24.840362 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-j92m9" Dec 05 23:30:24 crc kubenswrapper[4734]: I1205 23:30:24.842891 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-hnchl"] Dec 05 23:30:24 crc kubenswrapper[4734]: I1205 23:30:24.842938 4734 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-zkl6p" Dec 05 23:30:24 crc kubenswrapper[4734]: I1205 23:30:24.843040 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 05 23:30:24 crc kubenswrapper[4734]: I1205 23:30:24.843466 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 05 23:30:24 crc kubenswrapper[4734]: I1205 23:30:24.844177 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-hnchl" Dec 05 23:30:24 crc kubenswrapper[4734]: I1205 23:30:24.846412 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-j92m9"] Dec 05 23:30:24 crc kubenswrapper[4734]: I1205 23:30:24.847590 4734 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-hp956" Dec 05 23:30:24 crc kubenswrapper[4734]: I1205 23:30:24.868426 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-hnchl"] Dec 05 23:30:24 crc kubenswrapper[4734]: I1205 23:30:24.882971 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-pffbx"] Dec 05 23:30:24 crc kubenswrapper[4734]: I1205 23:30:24.884009 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-pffbx" Dec 05 23:30:24 crc kubenswrapper[4734]: I1205 23:30:24.886454 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt4dm\" (UniqueName: \"kubernetes.io/projected/77b0debe-a9d9-495d-baf3-e5ad3c05541a-kube-api-access-zt4dm\") pod \"cert-manager-5b446d88c5-hnchl\" (UID: \"77b0debe-a9d9-495d-baf3-e5ad3c05541a\") " pod="cert-manager/cert-manager-5b446d88c5-hnchl" Dec 05 23:30:24 crc kubenswrapper[4734]: I1205 23:30:24.886511 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb8vr\" (UniqueName: \"kubernetes.io/projected/4806cf35-7fd8-4044-8618-8e573c476375-kube-api-access-hb8vr\") pod \"cert-manager-cainjector-7f985d654d-j92m9\" (UID: \"4806cf35-7fd8-4044-8618-8e573c476375\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-j92m9" Dec 05 23:30:24 crc kubenswrapper[4734]: I1205 23:30:24.888915 4734 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-j976w" Dec 05 23:30:24 crc kubenswrapper[4734]: I1205 23:30:24.891671 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-pffbx"] Dec 05 23:30:24 crc kubenswrapper[4734]: I1205 23:30:24.987463 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt4dm\" (UniqueName: \"kubernetes.io/projected/77b0debe-a9d9-495d-baf3-e5ad3c05541a-kube-api-access-zt4dm\") pod \"cert-manager-5b446d88c5-hnchl\" (UID: \"77b0debe-a9d9-495d-baf3-e5ad3c05541a\") " pod="cert-manager/cert-manager-5b446d88c5-hnchl" Dec 05 23:30:24 crc kubenswrapper[4734]: I1205 23:30:24.987555 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qh84\" (UniqueName: 
\"kubernetes.io/projected/4aa5e323-62f7-491b-a47e-747b2d32cfc5-kube-api-access-9qh84\") pod \"cert-manager-webhook-5655c58dd6-pffbx\" (UID: \"4aa5e323-62f7-491b-a47e-747b2d32cfc5\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-pffbx" Dec 05 23:30:24 crc kubenswrapper[4734]: I1205 23:30:24.987604 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb8vr\" (UniqueName: \"kubernetes.io/projected/4806cf35-7fd8-4044-8618-8e573c476375-kube-api-access-hb8vr\") pod \"cert-manager-cainjector-7f985d654d-j92m9\" (UID: \"4806cf35-7fd8-4044-8618-8e573c476375\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-j92m9" Dec 05 23:30:25 crc kubenswrapper[4734]: I1205 23:30:25.009165 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt4dm\" (UniqueName: \"kubernetes.io/projected/77b0debe-a9d9-495d-baf3-e5ad3c05541a-kube-api-access-zt4dm\") pod \"cert-manager-5b446d88c5-hnchl\" (UID: \"77b0debe-a9d9-495d-baf3-e5ad3c05541a\") " pod="cert-manager/cert-manager-5b446d88c5-hnchl" Dec 05 23:30:25 crc kubenswrapper[4734]: I1205 23:30:25.009317 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb8vr\" (UniqueName: \"kubernetes.io/projected/4806cf35-7fd8-4044-8618-8e573c476375-kube-api-access-hb8vr\") pod \"cert-manager-cainjector-7f985d654d-j92m9\" (UID: \"4806cf35-7fd8-4044-8618-8e573c476375\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-j92m9" Dec 05 23:30:25 crc kubenswrapper[4734]: I1205 23:30:25.088927 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qh84\" (UniqueName: \"kubernetes.io/projected/4aa5e323-62f7-491b-a47e-747b2d32cfc5-kube-api-access-9qh84\") pod \"cert-manager-webhook-5655c58dd6-pffbx\" (UID: \"4aa5e323-62f7-491b-a47e-747b2d32cfc5\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-pffbx" Dec 05 23:30:25 crc kubenswrapper[4734]: I1205 23:30:25.104601 4734 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qh84\" (UniqueName: \"kubernetes.io/projected/4aa5e323-62f7-491b-a47e-747b2d32cfc5-kube-api-access-9qh84\") pod \"cert-manager-webhook-5655c58dd6-pffbx\" (UID: \"4aa5e323-62f7-491b-a47e-747b2d32cfc5\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-pffbx" Dec 05 23:30:25 crc kubenswrapper[4734]: I1205 23:30:25.166915 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-j92m9" Dec 05 23:30:25 crc kubenswrapper[4734]: I1205 23:30:25.174875 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-hnchl" Dec 05 23:30:25 crc kubenswrapper[4734]: I1205 23:30:25.198132 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-pffbx" Dec 05 23:30:25 crc kubenswrapper[4734]: I1205 23:30:25.449643 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-j92m9"] Dec 05 23:30:25 crc kubenswrapper[4734]: I1205 23:30:25.467482 4734 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 23:30:25 crc kubenswrapper[4734]: I1205 23:30:25.473336 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-hnchl"] Dec 05 23:30:25 crc kubenswrapper[4734]: W1205 23:30:25.484578 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77b0debe_a9d9_495d_baf3_e5ad3c05541a.slice/crio-e45beff1c420511c9f9ee1bcfe946b9779dfa1e81068acb5949dc0630934e3d7 WatchSource:0}: Error finding container e45beff1c420511c9f9ee1bcfe946b9779dfa1e81068acb5949dc0630934e3d7: Status 404 returned error can't find the container with id e45beff1c420511c9f9ee1bcfe946b9779dfa1e81068acb5949dc0630934e3d7 Dec 05 23:30:25 
crc kubenswrapper[4734]: I1205 23:30:25.512208 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-pffbx"] Dec 05 23:30:25 crc kubenswrapper[4734]: W1205 23:30:25.515846 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4aa5e323_62f7_491b_a47e_747b2d32cfc5.slice/crio-3cead715b9cfac4c8efe80b2ac9d358120933fe9502e9cab448f46eec2d9d6ff WatchSource:0}: Error finding container 3cead715b9cfac4c8efe80b2ac9d358120933fe9502e9cab448f46eec2d9d6ff: Status 404 returned error can't find the container with id 3cead715b9cfac4c8efe80b2ac9d358120933fe9502e9cab448f46eec2d9d6ff Dec 05 23:30:26 crc kubenswrapper[4734]: I1205 23:30:26.315669 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-hnchl" event={"ID":"77b0debe-a9d9-495d-baf3-e5ad3c05541a","Type":"ContainerStarted","Data":"e45beff1c420511c9f9ee1bcfe946b9779dfa1e81068acb5949dc0630934e3d7"} Dec 05 23:30:26 crc kubenswrapper[4734]: I1205 23:30:26.318072 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-j92m9" event={"ID":"4806cf35-7fd8-4044-8618-8e573c476375","Type":"ContainerStarted","Data":"c7182221a718120eb5563b47276db757de46b4d979e54e48a7323331c1404130"} Dec 05 23:30:26 crc kubenswrapper[4734]: I1205 23:30:26.319046 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-pffbx" event={"ID":"4aa5e323-62f7-491b-a47e-747b2d32cfc5","Type":"ContainerStarted","Data":"3cead715b9cfac4c8efe80b2ac9d358120933fe9502e9cab448f46eec2d9d6ff"} Dec 05 23:30:29 crc kubenswrapper[4734]: I1205 23:30:29.338682 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-pffbx" 
event={"ID":"4aa5e323-62f7-491b-a47e-747b2d32cfc5","Type":"ContainerStarted","Data":"4abf3ef81a6b731f41115519aae112e640b48d40dec5af00d0d63f1040035109"} Dec 05 23:30:29 crc kubenswrapper[4734]: I1205 23:30:29.339813 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-pffbx" Dec 05 23:30:29 crc kubenswrapper[4734]: I1205 23:30:29.341004 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-hnchl" event={"ID":"77b0debe-a9d9-495d-baf3-e5ad3c05541a","Type":"ContainerStarted","Data":"102f9004053541438640ed173264813a9d277e64f30f1b8ed81da2a3d294ace8"} Dec 05 23:30:29 crc kubenswrapper[4734]: I1205 23:30:29.342956 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-j92m9" event={"ID":"4806cf35-7fd8-4044-8618-8e573c476375","Type":"ContainerStarted","Data":"040c51747754dd34d3561f8e460292d5da6c94d7dd37caab9195e8e7c95c1487"} Dec 05 23:30:29 crc kubenswrapper[4734]: I1205 23:30:29.362754 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-pffbx" podStartSLOduration=2.207429923 podStartE2EDuration="5.362731169s" podCreationTimestamp="2025-12-05 23:30:24 +0000 UTC" firstStartedPulling="2025-12-05 23:30:25.518617891 +0000 UTC m=+646.202022167" lastFinishedPulling="2025-12-05 23:30:28.673919137 +0000 UTC m=+649.357323413" observedRunningTime="2025-12-05 23:30:29.356356694 +0000 UTC m=+650.039760980" watchObservedRunningTime="2025-12-05 23:30:29.362731169 +0000 UTC m=+650.046135445" Dec 05 23:30:29 crc kubenswrapper[4734]: I1205 23:30:29.373994 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-hnchl" podStartSLOduration=2.20229898 podStartE2EDuration="5.373963701s" podCreationTimestamp="2025-12-05 23:30:24 +0000 UTC" firstStartedPulling="2025-12-05 23:30:25.488687748 +0000 UTC 
m=+646.172092024" lastFinishedPulling="2025-12-05 23:30:28.660352439 +0000 UTC m=+649.343756745" observedRunningTime="2025-12-05 23:30:29.372600028 +0000 UTC m=+650.056004314" watchObservedRunningTime="2025-12-05 23:30:29.373963701 +0000 UTC m=+650.057367987" Dec 05 23:30:34 crc kubenswrapper[4734]: I1205 23:30:34.983026 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-j92m9" podStartSLOduration=7.769355208 podStartE2EDuration="10.983002634s" podCreationTimestamp="2025-12-05 23:30:24 +0000 UTC" firstStartedPulling="2025-12-05 23:30:25.46725214 +0000 UTC m=+646.150656416" lastFinishedPulling="2025-12-05 23:30:28.680899566 +0000 UTC m=+649.364303842" observedRunningTime="2025-12-05 23:30:29.402140442 +0000 UTC m=+650.085544718" watchObservedRunningTime="2025-12-05 23:30:34.983002634 +0000 UTC m=+655.666406910" Dec 05 23:30:34 crc kubenswrapper[4734]: I1205 23:30:34.983978 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8bfg7"] Dec 05 23:30:34 crc kubenswrapper[4734]: I1205 23:30:34.984396 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="ovn-controller" containerID="cri-o://d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a" gracePeriod=30 Dec 05 23:30:34 crc kubenswrapper[4734]: I1205 23:30:34.984444 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="nbdb" containerID="cri-o://c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e" gracePeriod=30 Dec 05 23:30:34 crc kubenswrapper[4734]: I1205 23:30:34.984562 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" 
podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="northd" containerID="cri-o://6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4" gracePeriod=30 Dec 05 23:30:34 crc kubenswrapper[4734]: I1205 23:30:34.984597 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1" gracePeriod=30 Dec 05 23:30:34 crc kubenswrapper[4734]: I1205 23:30:34.984631 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="kube-rbac-proxy-node" containerID="cri-o://69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946" gracePeriod=30 Dec 05 23:30:34 crc kubenswrapper[4734]: I1205 23:30:34.984660 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="ovn-acl-logging" containerID="cri-o://bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7" gracePeriod=30 Dec 05 23:30:34 crc kubenswrapper[4734]: I1205 23:30:34.984852 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="sbdb" containerID="cri-o://de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb" gracePeriod=30 Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.013332 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="ovnkube-controller" 
containerID="cri-o://3ce5cf13b483ebc55854fae2f3982a0784529abdb45de37fc8a07db02fa1fb80" gracePeriod=30 Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.202241 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-pffbx" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.342185 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8bfg7_2927a376-2f69-4820-a222-b86f08ece55a/ovnkube-controller/3.log" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.346214 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8bfg7_2927a376-2f69-4820-a222-b86f08ece55a/ovn-acl-logging/0.log" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.347301 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8bfg7_2927a376-2f69-4820-a222-b86f08ece55a/ovn-controller/0.log" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.348119 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.387246 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d6kmh_1d76dc4e-40f3-4457-9a99-16f9a8ca8081/kube-multus/2.log" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.387732 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d6kmh_1d76dc4e-40f3-4457-9a99-16f9a8ca8081/kube-multus/1.log" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.387788 4734 generic.go:334] "Generic (PLEG): container finished" podID="1d76dc4e-40f3-4457-9a99-16f9a8ca8081" containerID="a8a1ca8b179a33db1ca18703b7ff293739d406b155da94b438e9d16f215c6bb4" exitCode=2 Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.387873 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d6kmh" event={"ID":"1d76dc4e-40f3-4457-9a99-16f9a8ca8081","Type":"ContainerDied","Data":"a8a1ca8b179a33db1ca18703b7ff293739d406b155da94b438e9d16f215c6bb4"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.387926 4734 scope.go:117] "RemoveContainer" containerID="8453d43131f407bdf61410dd38713b44aea86c8647825551f40b2c41552206e8" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.388599 4734 scope.go:117] "RemoveContainer" containerID="a8a1ca8b179a33db1ca18703b7ff293739d406b155da94b438e9d16f215c6bb4" Dec 05 23:30:35 crc kubenswrapper[4734]: E1205 23:30:35.388818 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-d6kmh_openshift-multus(1d76dc4e-40f3-4457-9a99-16f9a8ca8081)\"" pod="openshift-multus/multus-d6kmh" podUID="1d76dc4e-40f3-4457-9a99-16f9a8ca8081" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.397372 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8bfg7_2927a376-2f69-4820-a222-b86f08ece55a/ovnkube-controller/3.log" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.401235 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8bfg7_2927a376-2f69-4820-a222-b86f08ece55a/ovn-acl-logging/0.log" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.401850 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8bfg7_2927a376-2f69-4820-a222-b86f08ece55a/ovn-controller/0.log" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402328 4734 generic.go:334] "Generic (PLEG): container finished" podID="2927a376-2f69-4820-a222-b86f08ece55a" containerID="3ce5cf13b483ebc55854fae2f3982a0784529abdb45de37fc8a07db02fa1fb80" exitCode=0 Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402386 4734 generic.go:334] "Generic (PLEG): container finished" podID="2927a376-2f69-4820-a222-b86f08ece55a" containerID="de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb" exitCode=0 Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402401 4734 generic.go:334] "Generic (PLEG): container finished" podID="2927a376-2f69-4820-a222-b86f08ece55a" containerID="c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e" exitCode=0 Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402432 4734 generic.go:334] "Generic (PLEG): container finished" podID="2927a376-2f69-4820-a222-b86f08ece55a" containerID="6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4" exitCode=0 Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402443 4734 generic.go:334] "Generic (PLEG): container finished" podID="2927a376-2f69-4820-a222-b86f08ece55a" containerID="846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1" exitCode=0 Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402447 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402462 4734 generic.go:334] "Generic (PLEG): container finished" podID="2927a376-2f69-4820-a222-b86f08ece55a" containerID="69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946" exitCode=0 Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402476 4734 generic.go:334] "Generic (PLEG): container finished" podID="2927a376-2f69-4820-a222-b86f08ece55a" containerID="bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7" exitCode=143 Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402489 4734 generic.go:334] "Generic (PLEG): container finished" podID="2927a376-2f69-4820-a222-b86f08ece55a" containerID="d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a" exitCode=143 Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402431 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" event={"ID":"2927a376-2f69-4820-a222-b86f08ece55a","Type":"ContainerDied","Data":"3ce5cf13b483ebc55854fae2f3982a0784529abdb45de37fc8a07db02fa1fb80"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402587 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" event={"ID":"2927a376-2f69-4820-a222-b86f08ece55a","Type":"ContainerDied","Data":"de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402615 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" event={"ID":"2927a376-2f69-4820-a222-b86f08ece55a","Type":"ContainerDied","Data":"c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402633 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" 
event={"ID":"2927a376-2f69-4820-a222-b86f08ece55a","Type":"ContainerDied","Data":"6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402652 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" event={"ID":"2927a376-2f69-4820-a222-b86f08ece55a","Type":"ContainerDied","Data":"846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402668 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" event={"ID":"2927a376-2f69-4820-a222-b86f08ece55a","Type":"ContainerDied","Data":"69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402691 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3ce5cf13b483ebc55854fae2f3982a0784529abdb45de37fc8a07db02fa1fb80"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402710 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402722 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402732 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402741 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402750 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402758 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402766 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402775 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402783 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402795 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" event={"ID":"2927a376-2f69-4820-a222-b86f08ece55a","Type":"ContainerDied","Data":"bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402808 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3ce5cf13b483ebc55854fae2f3982a0784529abdb45de37fc8a07db02fa1fb80"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402821 4734 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402832 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402841 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402851 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402861 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402872 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402882 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402892 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402902 4734 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402916 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" event={"ID":"2927a376-2f69-4820-a222-b86f08ece55a","Type":"ContainerDied","Data":"d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402933 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3ce5cf13b483ebc55854fae2f3982a0784529abdb45de37fc8a07db02fa1fb80"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402944 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402954 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402965 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402974 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402983 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1"} Dec 05 
23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.402992 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.403001 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.403011 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.403020 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.403034 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8bfg7" event={"ID":"2927a376-2f69-4820-a222-b86f08ece55a","Type":"ContainerDied","Data":"ad55e3286c595ac94160c058992d98d313047da1faf163f9547e7e2c17cdfebc"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.403049 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3ce5cf13b483ebc55854fae2f3982a0784529abdb45de37fc8a07db02fa1fb80"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.403060 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.403070 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.403080 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.403089 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.403099 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.403108 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.403119 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.403128 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.403138 4734 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6"} Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.404983 4734 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-ovn-kubernetes/ovnkube-node-2fpnb"] Dec 05 23:30:35 crc kubenswrapper[4734]: E1205 23:30:35.405337 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="ovnkube-controller" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.405496 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="ovnkube-controller" Dec 05 23:30:35 crc kubenswrapper[4734]: E1205 23:30:35.405515 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="ovnkube-controller" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.405854 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="ovnkube-controller" Dec 05 23:30:35 crc kubenswrapper[4734]: E1205 23:30:35.405870 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="ovn-controller" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.405881 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="ovn-controller" Dec 05 23:30:35 crc kubenswrapper[4734]: E1205 23:30:35.405899 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="ovn-acl-logging" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.405910 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="ovn-acl-logging" Dec 05 23:30:35 crc kubenswrapper[4734]: E1205 23:30:35.405929 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="ovnkube-controller" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.405939 4734 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="ovnkube-controller" Dec 05 23:30:35 crc kubenswrapper[4734]: E1205 23:30:35.405956 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="kubecfg-setup" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.405966 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="kubecfg-setup" Dec 05 23:30:35 crc kubenswrapper[4734]: E1205 23:30:35.405981 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="kube-rbac-proxy-node" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.405991 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="kube-rbac-proxy-node" Dec 05 23:30:35 crc kubenswrapper[4734]: E1205 23:30:35.406001 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="sbdb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.406009 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="sbdb" Dec 05 23:30:35 crc kubenswrapper[4734]: E1205 23:30:35.406022 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="ovnkube-controller" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.406030 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="ovnkube-controller" Dec 05 23:30:35 crc kubenswrapper[4734]: E1205 23:30:35.406044 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="ovnkube-controller" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.406052 4734 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="ovnkube-controller" Dec 05 23:30:35 crc kubenswrapper[4734]: E1205 23:30:35.406064 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="nbdb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.406073 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="nbdb" Dec 05 23:30:35 crc kubenswrapper[4734]: E1205 23:30:35.406085 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.406094 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 23:30:35 crc kubenswrapper[4734]: E1205 23:30:35.406102 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="northd" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.406110 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="northd" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.406255 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="ovn-controller" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.406267 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.406276 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="ovnkube-controller" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.406285 4734 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="ovn-acl-logging" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.406295 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="ovnkube-controller" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.406304 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="kube-rbac-proxy-node" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.406316 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="ovnkube-controller" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.406326 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="nbdb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.406338 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="northd" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.406347 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="sbdb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.406598 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="ovnkube-controller" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.406612 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="2927a376-2f69-4820-a222-b86f08ece55a" containerName="ovnkube-controller" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.409284 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.429027 4734 scope.go:117] "RemoveContainer" containerID="3ce5cf13b483ebc55854fae2f3982a0784529abdb45de37fc8a07db02fa1fb80" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.442340 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-systemd-units\") pod \"2927a376-2f69-4820-a222-b86f08ece55a\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.442385 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-run-systemd\") pod \"2927a376-2f69-4820-a222-b86f08ece55a\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.442410 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-node-log\") pod \"2927a376-2f69-4820-a222-b86f08ece55a\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.442430 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-cni-bin\") pod \"2927a376-2f69-4820-a222-b86f08ece55a\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.442464 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2927a376-2f69-4820-a222-b86f08ece55a-ovnkube-script-lib\") pod \"2927a376-2f69-4820-a222-b86f08ece55a\" (UID: 
\"2927a376-2f69-4820-a222-b86f08ece55a\") " Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.442491 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2927a376-2f69-4820-a222-b86f08ece55a-ovnkube-config\") pod \"2927a376-2f69-4820-a222-b86f08ece55a\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.442507 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-run-ovn-kubernetes\") pod \"2927a376-2f69-4820-a222-b86f08ece55a\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.442540 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-run-ovn\") pod \"2927a376-2f69-4820-a222-b86f08ece55a\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.442559 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-log-socket\") pod \"2927a376-2f69-4820-a222-b86f08ece55a\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.442510 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "2927a376-2f69-4820-a222-b86f08ece55a" (UID: "2927a376-2f69-4820-a222-b86f08ece55a"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.442581 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"2927a376-2f69-4820-a222-b86f08ece55a\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.442609 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2927a376-2f69-4820-a222-b86f08ece55a-ovn-node-metrics-cert\") pod \"2927a376-2f69-4820-a222-b86f08ece55a\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.442633 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-etc-openvswitch\") pod \"2927a376-2f69-4820-a222-b86f08ece55a\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.442833 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssw64\" (UniqueName: \"kubernetes.io/projected/2927a376-2f69-4820-a222-b86f08ece55a-kube-api-access-ssw64\") pod \"2927a376-2f69-4820-a222-b86f08ece55a\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.442852 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2927a376-2f69-4820-a222-b86f08ece55a-env-overrides\") pod \"2927a376-2f69-4820-a222-b86f08ece55a\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.442873 4734 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-cni-netd\") pod \"2927a376-2f69-4820-a222-b86f08ece55a\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.442898 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-kubelet\") pod \"2927a376-2f69-4820-a222-b86f08ece55a\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.442923 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-run-netns\") pod \"2927a376-2f69-4820-a222-b86f08ece55a\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.442942 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-var-lib-openvswitch\") pod \"2927a376-2f69-4820-a222-b86f08ece55a\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.442965 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-slash\") pod \"2927a376-2f69-4820-a222-b86f08ece55a\" (UID: \"2927a376-2f69-4820-a222-b86f08ece55a\") " Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.442981 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-run-openvswitch\") pod \"2927a376-2f69-4820-a222-b86f08ece55a\" (UID: 
\"2927a376-2f69-4820-a222-b86f08ece55a\") " Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.443043 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ab6a2d7a-1ed3-4fac-b273-283a11937962-ovnkube-config\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.443075 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-host-slash\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.443098 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-systemd-units\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.443114 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-host-run-ovn-kubernetes\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.443164 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ab6a2d7a-1ed3-4fac-b273-283a11937962-ovnkube-script-lib\") pod \"ovnkube-node-2fpnb\" (UID: 
\"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.443186 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-run-systemd\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.443205 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-host-run-netns\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.443221 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-etc-openvswitch\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.443238 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-host-cni-netd\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.443283 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-host-cni-bin\") pod \"ovnkube-node-2fpnb\" 
(UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.443303 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-node-log\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.443322 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-var-lib-openvswitch\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.443341 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ab6a2d7a-1ed3-4fac-b273-283a11937962-env-overrides\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.443367 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ab6a2d7a-1ed3-4fac-b273-283a11937962-ovn-node-metrics-cert\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.443385 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-log-socket\") pod 
\"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.443403 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.443428 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdcqx\" (UniqueName: \"kubernetes.io/projected/ab6a2d7a-1ed3-4fac-b273-283a11937962-kube-api-access-rdcqx\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.443452 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-run-openvswitch\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.443474 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-host-kubelet\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.443492 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-run-ovn\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.443542 4734 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.443579 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2927a376-2f69-4820-a222-b86f08ece55a-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "2927a376-2f69-4820-a222-b86f08ece55a" (UID: "2927a376-2f69-4820-a222-b86f08ece55a"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.443620 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "2927a376-2f69-4820-a222-b86f08ece55a" (UID: "2927a376-2f69-4820-a222-b86f08ece55a"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.443653 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "2927a376-2f69-4820-a222-b86f08ece55a" (UID: "2927a376-2f69-4820-a222-b86f08ece55a"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.443653 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "2927a376-2f69-4820-a222-b86f08ece55a" (UID: "2927a376-2f69-4820-a222-b86f08ece55a"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.443681 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-log-socket" (OuterVolumeSpecName: "log-socket") pod "2927a376-2f69-4820-a222-b86f08ece55a" (UID: "2927a376-2f69-4820-a222-b86f08ece55a"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.443707 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "2927a376-2f69-4820-a222-b86f08ece55a" (UID: "2927a376-2f69-4820-a222-b86f08ece55a"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.443708 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-node-log" (OuterVolumeSpecName: "node-log") pod "2927a376-2f69-4820-a222-b86f08ece55a" (UID: "2927a376-2f69-4820-a222-b86f08ece55a"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.443775 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "2927a376-2f69-4820-a222-b86f08ece55a" (UID: "2927a376-2f69-4820-a222-b86f08ece55a"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.444235 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2927a376-2f69-4820-a222-b86f08ece55a-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "2927a376-2f69-4820-a222-b86f08ece55a" (UID: "2927a376-2f69-4820-a222-b86f08ece55a"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.444302 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "2927a376-2f69-4820-a222-b86f08ece55a" (UID: "2927a376-2f69-4820-a222-b86f08ece55a"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.444734 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2927a376-2f69-4820-a222-b86f08ece55a-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "2927a376-2f69-4820-a222-b86f08ece55a" (UID: "2927a376-2f69-4820-a222-b86f08ece55a"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.444782 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "2927a376-2f69-4820-a222-b86f08ece55a" (UID: "2927a376-2f69-4820-a222-b86f08ece55a"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.444827 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-slash" (OuterVolumeSpecName: "host-slash") pod "2927a376-2f69-4820-a222-b86f08ece55a" (UID: "2927a376-2f69-4820-a222-b86f08ece55a"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.444790 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "2927a376-2f69-4820-a222-b86f08ece55a" (UID: "2927a376-2f69-4820-a222-b86f08ece55a"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.444910 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "2927a376-2f69-4820-a222-b86f08ece55a" (UID: "2927a376-2f69-4820-a222-b86f08ece55a"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.444977 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "2927a376-2f69-4820-a222-b86f08ece55a" (UID: "2927a376-2f69-4820-a222-b86f08ece55a"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.456237 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2927a376-2f69-4820-a222-b86f08ece55a-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "2927a376-2f69-4820-a222-b86f08ece55a" (UID: "2927a376-2f69-4820-a222-b86f08ece55a"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.461199 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2927a376-2f69-4820-a222-b86f08ece55a-kube-api-access-ssw64" (OuterVolumeSpecName: "kube-api-access-ssw64") pod "2927a376-2f69-4820-a222-b86f08ece55a" (UID: "2927a376-2f69-4820-a222-b86f08ece55a"). InnerVolumeSpecName "kube-api-access-ssw64". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.462175 4734 scope.go:117] "RemoveContainer" containerID="9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.464792 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "2927a376-2f69-4820-a222-b86f08ece55a" (UID: "2927a376-2f69-4820-a222-b86f08ece55a"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.485573 4734 scope.go:117] "RemoveContainer" containerID="de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.500797 4734 scope.go:117] "RemoveContainer" containerID="c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.520898 4734 scope.go:117] "RemoveContainer" containerID="6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.539408 4734 scope.go:117] "RemoveContainer" containerID="846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.544395 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ab6a2d7a-1ed3-4fac-b273-283a11937962-ovn-node-metrics-cert\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.544453 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-log-socket\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.544662 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc 
kubenswrapper[4734]: I1205 23:30:35.544583 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-log-socket\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.544828 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.544881 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdcqx\" (UniqueName: \"kubernetes.io/projected/ab6a2d7a-1ed3-4fac-b273-283a11937962-kube-api-access-rdcqx\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.544978 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-run-openvswitch\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.545062 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-run-openvswitch\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 
23:30:35.545205 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-host-kubelet\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.545144 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-host-kubelet\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.545301 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-run-ovn\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.545390 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-run-ovn\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.545468 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ab6a2d7a-1ed3-4fac-b273-283a11937962-ovnkube-config\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.545601 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-host-slash\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.545841 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-systemd-units\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.545871 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-host-run-ovn-kubernetes\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.545913 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ab6a2d7a-1ed3-4fac-b273-283a11937962-ovnkube-script-lib\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.545956 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-run-systemd\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.545992 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-host-run-netns\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.546026 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-etc-openvswitch\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.546058 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-host-cni-netd\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.546104 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-node-log\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.546135 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-host-cni-bin\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.546182 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-var-lib-openvswitch\") pod 
\"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.546217 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ab6a2d7a-1ed3-4fac-b273-283a11937962-env-overrides\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.546290 4734 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2927a376-2f69-4820-a222-b86f08ece55a-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.546310 4734 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.546330 4734 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.546350 4734 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-log-socket\") on node \"crc\" DevicePath \"\"" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.546369 4734 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.546389 4734 reconciler_common.go:293] "Volume 
detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2927a376-2f69-4820-a222-b86f08ece55a-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.546411 4734 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.546430 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssw64\" (UniqueName: \"kubernetes.io/projected/2927a376-2f69-4820-a222-b86f08ece55a-kube-api-access-ssw64\") on node \"crc\" DevicePath \"\"" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.546448 4734 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2927a376-2f69-4820-a222-b86f08ece55a-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.546463 4734 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.546481 4734 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.546499 4734 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.546515 4734 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.547020 4734 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-slash\") on node \"crc\" DevicePath \"\"" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.547041 4734 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.547592 4734 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.547616 4734 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-node-log\") on node \"crc\" DevicePath \"\"" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.547677 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ab6a2d7a-1ed3-4fac-b273-283a11937962-ovnkube-config\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.547676 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ab6a2d7a-1ed3-4fac-b273-283a11937962-env-overrides\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.547727 4734 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-host-run-ovn-kubernetes\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.547740 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-host-run-netns\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.547859 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-host-slash\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.547859 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-run-systemd\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.547909 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-host-cni-netd\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.547891 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-systemd-units\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.547892 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-node-log\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.547880 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-host-cni-bin\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.547913 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-etc-openvswitch\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.547959 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ab6a2d7a-1ed3-4fac-b273-283a11937962-var-lib-openvswitch\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.548009 4734 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2927a376-2f69-4820-a222-b86f08ece55a-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 05 23:30:35 crc 
kubenswrapper[4734]: I1205 23:30:35.548180 4734 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2927a376-2f69-4820-a222-b86f08ece55a-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.548711 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ab6a2d7a-1ed3-4fac-b273-283a11937962-ovnkube-script-lib\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.549038 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ab6a2d7a-1ed3-4fac-b273-283a11937962-ovn-node-metrics-cert\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.560009 4734 scope.go:117] "RemoveContainer" containerID="69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.569394 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdcqx\" (UniqueName: \"kubernetes.io/projected/ab6a2d7a-1ed3-4fac-b273-283a11937962-kube-api-access-rdcqx\") pod \"ovnkube-node-2fpnb\" (UID: \"ab6a2d7a-1ed3-4fac-b273-283a11937962\") " pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.577065 4734 scope.go:117] "RemoveContainer" containerID="bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.594422 4734 scope.go:117] "RemoveContainer" containerID="d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a" Dec 05 23:30:35 crc kubenswrapper[4734]: 
I1205 23:30:35.612183 4734 scope.go:117] "RemoveContainer" containerID="bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.630778 4734 scope.go:117] "RemoveContainer" containerID="3ce5cf13b483ebc55854fae2f3982a0784529abdb45de37fc8a07db02fa1fb80" Dec 05 23:30:35 crc kubenswrapper[4734]: E1205 23:30:35.631399 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ce5cf13b483ebc55854fae2f3982a0784529abdb45de37fc8a07db02fa1fb80\": container with ID starting with 3ce5cf13b483ebc55854fae2f3982a0784529abdb45de37fc8a07db02fa1fb80 not found: ID does not exist" containerID="3ce5cf13b483ebc55854fae2f3982a0784529abdb45de37fc8a07db02fa1fb80" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.631498 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ce5cf13b483ebc55854fae2f3982a0784529abdb45de37fc8a07db02fa1fb80"} err="failed to get container status \"3ce5cf13b483ebc55854fae2f3982a0784529abdb45de37fc8a07db02fa1fb80\": rpc error: code = NotFound desc = could not find container \"3ce5cf13b483ebc55854fae2f3982a0784529abdb45de37fc8a07db02fa1fb80\": container with ID starting with 3ce5cf13b483ebc55854fae2f3982a0784529abdb45de37fc8a07db02fa1fb80 not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.631626 4734 scope.go:117] "RemoveContainer" containerID="9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549" Dec 05 23:30:35 crc kubenswrapper[4734]: E1205 23:30:35.632322 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549\": container with ID starting with 9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549 not found: ID does not exist" 
containerID="9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.632383 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549"} err="failed to get container status \"9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549\": rpc error: code = NotFound desc = could not find container \"9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549\": container with ID starting with 9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549 not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.632431 4734 scope.go:117] "RemoveContainer" containerID="de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb" Dec 05 23:30:35 crc kubenswrapper[4734]: E1205 23:30:35.632776 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb\": container with ID starting with de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb not found: ID does not exist" containerID="de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.632824 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb"} err="failed to get container status \"de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb\": rpc error: code = NotFound desc = could not find container \"de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb\": container with ID starting with de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.632852 4734 scope.go:117] 
"RemoveContainer" containerID="c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e" Dec 05 23:30:35 crc kubenswrapper[4734]: E1205 23:30:35.633416 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e\": container with ID starting with c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e not found: ID does not exist" containerID="c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.633494 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e"} err="failed to get container status \"c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e\": rpc error: code = NotFound desc = could not find container \"c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e\": container with ID starting with c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.633521 4734 scope.go:117] "RemoveContainer" containerID="6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4" Dec 05 23:30:35 crc kubenswrapper[4734]: E1205 23:30:35.633966 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4\": container with ID starting with 6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4 not found: ID does not exist" containerID="6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.634024 4734 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4"} err="failed to get container status \"6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4\": rpc error: code = NotFound desc = could not find container \"6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4\": container with ID starting with 6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4 not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.634052 4734 scope.go:117] "RemoveContainer" containerID="846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1" Dec 05 23:30:35 crc kubenswrapper[4734]: E1205 23:30:35.634453 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1\": container with ID starting with 846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1 not found: ID does not exist" containerID="846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.634507 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1"} err="failed to get container status \"846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1\": rpc error: code = NotFound desc = could not find container \"846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1\": container with ID starting with 846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1 not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.634593 4734 scope.go:117] "RemoveContainer" containerID="69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946" Dec 05 23:30:35 crc kubenswrapper[4734]: E1205 23:30:35.634923 4734 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946\": container with ID starting with 69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946 not found: ID does not exist" containerID="69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.634979 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946"} err="failed to get container status \"69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946\": rpc error: code = NotFound desc = could not find container \"69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946\": container with ID starting with 69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946 not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.635013 4734 scope.go:117] "RemoveContainer" containerID="bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7" Dec 05 23:30:35 crc kubenswrapper[4734]: E1205 23:30:35.635347 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7\": container with ID starting with bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7 not found: ID does not exist" containerID="bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.635394 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7"} err="failed to get container status \"bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7\": rpc error: code = NotFound desc = could not find container 
\"bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7\": container with ID starting with bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7 not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.635420 4734 scope.go:117] "RemoveContainer" containerID="d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a" Dec 05 23:30:35 crc kubenswrapper[4734]: E1205 23:30:35.635819 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a\": container with ID starting with d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a not found: ID does not exist" containerID="d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.635862 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a"} err="failed to get container status \"d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a\": rpc error: code = NotFound desc = could not find container \"d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a\": container with ID starting with d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.635889 4734 scope.go:117] "RemoveContainer" containerID="bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6" Dec 05 23:30:35 crc kubenswrapper[4734]: E1205 23:30:35.636269 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\": container with ID starting with bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6 not found: ID does not exist" 
containerID="bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.636313 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6"} err="failed to get container status \"bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\": rpc error: code = NotFound desc = could not find container \"bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\": container with ID starting with bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6 not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.636340 4734 scope.go:117] "RemoveContainer" containerID="3ce5cf13b483ebc55854fae2f3982a0784529abdb45de37fc8a07db02fa1fb80" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.636774 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ce5cf13b483ebc55854fae2f3982a0784529abdb45de37fc8a07db02fa1fb80"} err="failed to get container status \"3ce5cf13b483ebc55854fae2f3982a0784529abdb45de37fc8a07db02fa1fb80\": rpc error: code = NotFound desc = could not find container \"3ce5cf13b483ebc55854fae2f3982a0784529abdb45de37fc8a07db02fa1fb80\": container with ID starting with 3ce5cf13b483ebc55854fae2f3982a0784529abdb45de37fc8a07db02fa1fb80 not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.636811 4734 scope.go:117] "RemoveContainer" containerID="9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.637203 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549"} err="failed to get container status \"9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549\": rpc error: code = NotFound desc = could 
not find container \"9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549\": container with ID starting with 9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549 not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.637242 4734 scope.go:117] "RemoveContainer" containerID="de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.638176 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb"} err="failed to get container status \"de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb\": rpc error: code = NotFound desc = could not find container \"de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb\": container with ID starting with de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.638237 4734 scope.go:117] "RemoveContainer" containerID="c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.638644 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e"} err="failed to get container status \"c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e\": rpc error: code = NotFound desc = could not find container \"c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e\": container with ID starting with c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.638701 4734 scope.go:117] "RemoveContainer" containerID="6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 
23:30:35.639077 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4"} err="failed to get container status \"6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4\": rpc error: code = NotFound desc = could not find container \"6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4\": container with ID starting with 6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4 not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.639117 4734 scope.go:117] "RemoveContainer" containerID="846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.639426 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1"} err="failed to get container status \"846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1\": rpc error: code = NotFound desc = could not find container \"846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1\": container with ID starting with 846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1 not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.639460 4734 scope.go:117] "RemoveContainer" containerID="69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.639838 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946"} err="failed to get container status \"69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946\": rpc error: code = NotFound desc = could not find container \"69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946\": container with ID starting with 
69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946 not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.639892 4734 scope.go:117] "RemoveContainer" containerID="bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.640431 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7"} err="failed to get container status \"bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7\": rpc error: code = NotFound desc = could not find container \"bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7\": container with ID starting with bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7 not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.640476 4734 scope.go:117] "RemoveContainer" containerID="d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.640811 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a"} err="failed to get container status \"d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a\": rpc error: code = NotFound desc = could not find container \"d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a\": container with ID starting with d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.640859 4734 scope.go:117] "RemoveContainer" containerID="bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.641738 4734 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6"} err="failed to get container status \"bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\": rpc error: code = NotFound desc = could not find container \"bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\": container with ID starting with bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6 not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.641786 4734 scope.go:117] "RemoveContainer" containerID="3ce5cf13b483ebc55854fae2f3982a0784529abdb45de37fc8a07db02fa1fb80" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.642251 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ce5cf13b483ebc55854fae2f3982a0784529abdb45de37fc8a07db02fa1fb80"} err="failed to get container status \"3ce5cf13b483ebc55854fae2f3982a0784529abdb45de37fc8a07db02fa1fb80\": rpc error: code = NotFound desc = could not find container \"3ce5cf13b483ebc55854fae2f3982a0784529abdb45de37fc8a07db02fa1fb80\": container with ID starting with 3ce5cf13b483ebc55854fae2f3982a0784529abdb45de37fc8a07db02fa1fb80 not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.642294 4734 scope.go:117] "RemoveContainer" containerID="9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.644777 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549"} err="failed to get container status \"9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549\": rpc error: code = NotFound desc = could not find container \"9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549\": container with ID starting with 9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549 not found: ID does not 
exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.644859 4734 scope.go:117] "RemoveContainer" containerID="de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.646034 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb"} err="failed to get container status \"de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb\": rpc error: code = NotFound desc = could not find container \"de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb\": container with ID starting with de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.646131 4734 scope.go:117] "RemoveContainer" containerID="c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.646776 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e"} err="failed to get container status \"c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e\": rpc error: code = NotFound desc = could not find container \"c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e\": container with ID starting with c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.646843 4734 scope.go:117] "RemoveContainer" containerID="6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.647186 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4"} err="failed to get container status 
\"6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4\": rpc error: code = NotFound desc = could not find container \"6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4\": container with ID starting with 6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4 not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.647230 4734 scope.go:117] "RemoveContainer" containerID="846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.648054 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1"} err="failed to get container status \"846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1\": rpc error: code = NotFound desc = could not find container \"846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1\": container with ID starting with 846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1 not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.648195 4734 scope.go:117] "RemoveContainer" containerID="69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.649041 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946"} err="failed to get container status \"69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946\": rpc error: code = NotFound desc = could not find container \"69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946\": container with ID starting with 69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946 not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.649086 4734 scope.go:117] "RemoveContainer" 
containerID="bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.649469 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7"} err="failed to get container status \"bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7\": rpc error: code = NotFound desc = could not find container \"bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7\": container with ID starting with bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7 not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.649551 4734 scope.go:117] "RemoveContainer" containerID="d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.650119 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a"} err="failed to get container status \"d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a\": rpc error: code = NotFound desc = could not find container \"d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a\": container with ID starting with d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.650192 4734 scope.go:117] "RemoveContainer" containerID="bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.650598 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6"} err="failed to get container status \"bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\": rpc error: code = NotFound desc = could 
not find container \"bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\": container with ID starting with bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6 not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.650639 4734 scope.go:117] "RemoveContainer" containerID="3ce5cf13b483ebc55854fae2f3982a0784529abdb45de37fc8a07db02fa1fb80" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.650939 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ce5cf13b483ebc55854fae2f3982a0784529abdb45de37fc8a07db02fa1fb80"} err="failed to get container status \"3ce5cf13b483ebc55854fae2f3982a0784529abdb45de37fc8a07db02fa1fb80\": rpc error: code = NotFound desc = could not find container \"3ce5cf13b483ebc55854fae2f3982a0784529abdb45de37fc8a07db02fa1fb80\": container with ID starting with 3ce5cf13b483ebc55854fae2f3982a0784529abdb45de37fc8a07db02fa1fb80 not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.651016 4734 scope.go:117] "RemoveContainer" containerID="9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.651455 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549"} err="failed to get container status \"9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549\": rpc error: code = NotFound desc = could not find container \"9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549\": container with ID starting with 9491ba7b92932339f3ef1b9532d4ee5e33025995b6795edd6b9f0a6ab24ef549 not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.651578 4734 scope.go:117] "RemoveContainer" containerID="de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 
23:30:35.652005 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb"} err="failed to get container status \"de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb\": rpc error: code = NotFound desc = could not find container \"de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb\": container with ID starting with de154c86aec3ae6bc3181178ccb74f0cacaf5f5e795c69ff06116335d8cadadb not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.652042 4734 scope.go:117] "RemoveContainer" containerID="c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.652475 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e"} err="failed to get container status \"c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e\": rpc error: code = NotFound desc = could not find container \"c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e\": container with ID starting with c4d62f1adfda28afd8c1172fb1dcc6dab52112e850abc9a0c6bb65f3ba210b8e not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.652585 4734 scope.go:117] "RemoveContainer" containerID="6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.652970 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4"} err="failed to get container status \"6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4\": rpc error: code = NotFound desc = could not find container \"6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4\": container with ID starting with 
6bd8191ffafc4c4cc45a1f4a0a7e3c173a7f3bfe893abcae405555825448c3b4 not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.653010 4734 scope.go:117] "RemoveContainer" containerID="846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.653374 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1"} err="failed to get container status \"846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1\": rpc error: code = NotFound desc = could not find container \"846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1\": container with ID starting with 846f184fb34012c6331d3729bc1d8500148ff9195365ac187f60b6fa1011dfe1 not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.653467 4734 scope.go:117] "RemoveContainer" containerID="69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.654105 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946"} err="failed to get container status \"69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946\": rpc error: code = NotFound desc = could not find container \"69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946\": container with ID starting with 69c0e762d7daefcb1129238d6a242930f01d6e498acc8d7fc505b425bcb29946 not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.654229 4734 scope.go:117] "RemoveContainer" containerID="bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.655284 4734 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7"} err="failed to get container status \"bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7\": rpc error: code = NotFound desc = could not find container \"bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7\": container with ID starting with bb39d43c81a2969d2be962f0c980c2b6607c07c2e82eaada6d42f9393fba92a7 not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.655333 4734 scope.go:117] "RemoveContainer" containerID="d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.655755 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a"} err="failed to get container status \"d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a\": rpc error: code = NotFound desc = could not find container \"d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a\": container with ID starting with d9cae6c7782dd580e283cd72e4ab586fda2cd8a8de320a273de631f4b7d4f34a not found: ID does not exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.655801 4734 scope.go:117] "RemoveContainer" containerID="bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.656189 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6"} err="failed to get container status \"bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\": rpc error: code = NotFound desc = could not find container \"bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6\": container with ID starting with bbf12f86a2baa0dbcd578f332ab13d747fb0853318b40f777832c697777b2cb6 not found: ID does not 
exist" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.733290 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.739142 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8bfg7"] Dec 05 23:30:35 crc kubenswrapper[4734]: I1205 23:30:35.746581 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8bfg7"] Dec 05 23:30:36 crc kubenswrapper[4734]: I1205 23:30:36.414886 4734 generic.go:334] "Generic (PLEG): container finished" podID="ab6a2d7a-1ed3-4fac-b273-283a11937962" containerID="de3b18bf27e10f6152a8a1ac16c780adc5fe0748139313a6a43cc3ef8b92be41" exitCode=0 Dec 05 23:30:36 crc kubenswrapper[4734]: I1205 23:30:36.415043 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" event={"ID":"ab6a2d7a-1ed3-4fac-b273-283a11937962","Type":"ContainerDied","Data":"de3b18bf27e10f6152a8a1ac16c780adc5fe0748139313a6a43cc3ef8b92be41"} Dec 05 23:30:36 crc kubenswrapper[4734]: I1205 23:30:36.415123 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" event={"ID":"ab6a2d7a-1ed3-4fac-b273-283a11937962","Type":"ContainerStarted","Data":"d90190307bc4c791804cebb9dc8ff3c84e401eb4dee39f73f780b0e69a4d2afd"} Dec 05 23:30:36 crc kubenswrapper[4734]: I1205 23:30:36.419492 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d6kmh_1d76dc4e-40f3-4457-9a99-16f9a8ca8081/kube-multus/2.log" Dec 05 23:30:37 crc kubenswrapper[4734]: I1205 23:30:37.436162 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" event={"ID":"ab6a2d7a-1ed3-4fac-b273-283a11937962","Type":"ContainerStarted","Data":"b04120c28a6b3381cb9249335d2bdaf9fd60260ba00cc4a90b2371737ea95767"} Dec 05 23:30:37 crc 
kubenswrapper[4734]: I1205 23:30:37.436636 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" event={"ID":"ab6a2d7a-1ed3-4fac-b273-283a11937962","Type":"ContainerStarted","Data":"a6fa0753eb8aeb5ae58708805b78faa10eb380cfbcf4503f67c1d0cae768882c"} Dec 05 23:30:37 crc kubenswrapper[4734]: I1205 23:30:37.436651 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" event={"ID":"ab6a2d7a-1ed3-4fac-b273-283a11937962","Type":"ContainerStarted","Data":"a61b80f321c72e322bd9af43ea661f8530cbf576bd36c8364b93f7e3911a5623"} Dec 05 23:30:37 crc kubenswrapper[4734]: I1205 23:30:37.436662 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" event={"ID":"ab6a2d7a-1ed3-4fac-b273-283a11937962","Type":"ContainerStarted","Data":"ac636accec0fce47baab62e6cd20497f8977a7d472686013bef36e2c1b59e988"} Dec 05 23:30:37 crc kubenswrapper[4734]: I1205 23:30:37.436674 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" event={"ID":"ab6a2d7a-1ed3-4fac-b273-283a11937962","Type":"ContainerStarted","Data":"095aefbf3eb3c4871e964c0e7e240d45ece494ae48269d1460bc5a755bd63b83"} Dec 05 23:30:37 crc kubenswrapper[4734]: I1205 23:30:37.436683 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" event={"ID":"ab6a2d7a-1ed3-4fac-b273-283a11937962","Type":"ContainerStarted","Data":"941b6369da7d0b9c8462627738d28cef7e7ae3269ca3c7abd6fce25c8dff105c"} Dec 05 23:30:37 crc kubenswrapper[4734]: I1205 23:30:37.623195 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2927a376-2f69-4820-a222-b86f08ece55a" path="/var/lib/kubelet/pods/2927a376-2f69-4820-a222-b86f08ece55a/volumes" Dec 05 23:30:40 crc kubenswrapper[4734]: I1205 23:30:40.461297 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" 
event={"ID":"ab6a2d7a-1ed3-4fac-b273-283a11937962","Type":"ContainerStarted","Data":"a35ed7ce6a4a6734443b83cf0cb5f3ba208a7705aa576bc6af1960ea87560e32"} Dec 05 23:30:42 crc kubenswrapper[4734]: I1205 23:30:42.477393 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" event={"ID":"ab6a2d7a-1ed3-4fac-b273-283a11937962","Type":"ContainerStarted","Data":"e7f8d2f30ea240cded272f5e9a9631020d2d29d91d4d18bdc08610477dfe1d0e"} Dec 05 23:30:42 crc kubenswrapper[4734]: I1205 23:30:42.478378 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:42 crc kubenswrapper[4734]: I1205 23:30:42.478400 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:42 crc kubenswrapper[4734]: I1205 23:30:42.478415 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:42 crc kubenswrapper[4734]: I1205 23:30:42.508418 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:42 crc kubenswrapper[4734]: I1205 23:30:42.510934 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:30:42 crc kubenswrapper[4734]: I1205 23:30:42.524065 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" podStartSLOduration=7.524042919 podStartE2EDuration="7.524042919s" podCreationTimestamp="2025-12-05 23:30:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:30:42.521497817 +0000 UTC m=+663.204902093" watchObservedRunningTime="2025-12-05 23:30:42.524042919 +0000 UTC m=+663.207447205" Dec 05 23:30:46 crc 
kubenswrapper[4734]: I1205 23:30:46.615466 4734 scope.go:117] "RemoveContainer" containerID="a8a1ca8b179a33db1ca18703b7ff293739d406b155da94b438e9d16f215c6bb4" Dec 05 23:30:46 crc kubenswrapper[4734]: E1205 23:30:46.616186 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-d6kmh_openshift-multus(1d76dc4e-40f3-4457-9a99-16f9a8ca8081)\"" pod="openshift-multus/multus-d6kmh" podUID="1d76dc4e-40f3-4457-9a99-16f9a8ca8081" Dec 05 23:30:59 crc kubenswrapper[4734]: I1205 23:30:59.619683 4734 scope.go:117] "RemoveContainer" containerID="a8a1ca8b179a33db1ca18703b7ff293739d406b155da94b438e9d16f215c6bb4" Dec 05 23:31:00 crc kubenswrapper[4734]: I1205 23:31:00.612178 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d6kmh_1d76dc4e-40f3-4457-9a99-16f9a8ca8081/kube-multus/2.log" Dec 05 23:31:00 crc kubenswrapper[4734]: I1205 23:31:00.612686 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d6kmh" event={"ID":"1d76dc4e-40f3-4457-9a99-16f9a8ca8081","Type":"ContainerStarted","Data":"df0aa42779c16b4fc53e561b3f8cf65f0c65d9899b7744a1615ba20b4b55d9d9"} Dec 05 23:31:05 crc kubenswrapper[4734]: I1205 23:31:05.764706 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2fpnb" Dec 05 23:31:16 crc kubenswrapper[4734]: I1205 23:31:16.226968 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5"] Dec 05 23:31:16 crc kubenswrapper[4734]: I1205 23:31:16.228996 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5" Dec 05 23:31:16 crc kubenswrapper[4734]: I1205 23:31:16.231353 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 05 23:31:16 crc kubenswrapper[4734]: I1205 23:31:16.240869 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5"] Dec 05 23:31:16 crc kubenswrapper[4734]: I1205 23:31:16.318196 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44479d90-68b0-4428-b667-5c5c8bbebf2e-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5\" (UID: \"44479d90-68b0-4428-b667-5c5c8bbebf2e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5" Dec 05 23:31:16 crc kubenswrapper[4734]: I1205 23:31:16.318663 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44479d90-68b0-4428-b667-5c5c8bbebf2e-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5\" (UID: \"44479d90-68b0-4428-b667-5c5c8bbebf2e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5" Dec 05 23:31:16 crc kubenswrapper[4734]: I1205 23:31:16.318711 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4dp2\" (UniqueName: \"kubernetes.io/projected/44479d90-68b0-4428-b667-5c5c8bbebf2e-kube-api-access-v4dp2\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5\" (UID: \"44479d90-68b0-4428-b667-5c5c8bbebf2e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5" Dec 05 23:31:16 crc kubenswrapper[4734]: 
I1205 23:31:16.420011 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44479d90-68b0-4428-b667-5c5c8bbebf2e-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5\" (UID: \"44479d90-68b0-4428-b667-5c5c8bbebf2e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5" Dec 05 23:31:16 crc kubenswrapper[4734]: I1205 23:31:16.420095 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44479d90-68b0-4428-b667-5c5c8bbebf2e-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5\" (UID: \"44479d90-68b0-4428-b667-5c5c8bbebf2e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5" Dec 05 23:31:16 crc kubenswrapper[4734]: I1205 23:31:16.420139 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4dp2\" (UniqueName: \"kubernetes.io/projected/44479d90-68b0-4428-b667-5c5c8bbebf2e-kube-api-access-v4dp2\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5\" (UID: \"44479d90-68b0-4428-b667-5c5c8bbebf2e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5" Dec 05 23:31:16 crc kubenswrapper[4734]: I1205 23:31:16.420817 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44479d90-68b0-4428-b667-5c5c8bbebf2e-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5\" (UID: \"44479d90-68b0-4428-b667-5c5c8bbebf2e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5" Dec 05 23:31:16 crc kubenswrapper[4734]: I1205 23:31:16.420968 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/44479d90-68b0-4428-b667-5c5c8bbebf2e-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5\" (UID: \"44479d90-68b0-4428-b667-5c5c8bbebf2e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5" Dec 05 23:31:16 crc kubenswrapper[4734]: I1205 23:31:16.458440 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4dp2\" (UniqueName: \"kubernetes.io/projected/44479d90-68b0-4428-b667-5c5c8bbebf2e-kube-api-access-v4dp2\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5\" (UID: \"44479d90-68b0-4428-b667-5c5c8bbebf2e\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5" Dec 05 23:31:16 crc kubenswrapper[4734]: I1205 23:31:16.552893 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5" Dec 05 23:31:16 crc kubenswrapper[4734]: I1205 23:31:16.769104 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5"] Dec 05 23:31:17 crc kubenswrapper[4734]: I1205 23:31:17.726766 4734 generic.go:334] "Generic (PLEG): container finished" podID="44479d90-68b0-4428-b667-5c5c8bbebf2e" containerID="95f07b2cff5ed6c8ab803839877db54e0a7b0e42bccc9f9498e62fcc29820527" exitCode=0 Dec 05 23:31:17 crc kubenswrapper[4734]: I1205 23:31:17.726826 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5" event={"ID":"44479d90-68b0-4428-b667-5c5c8bbebf2e","Type":"ContainerDied","Data":"95f07b2cff5ed6c8ab803839877db54e0a7b0e42bccc9f9498e62fcc29820527"} Dec 05 23:31:17 crc kubenswrapper[4734]: I1205 23:31:17.726863 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5" event={"ID":"44479d90-68b0-4428-b667-5c5c8bbebf2e","Type":"ContainerStarted","Data":"b08354fa387f28524f0489ea9b354803a5001670239a56cb78570f7a8a85ea41"}
Dec 05 23:31:19 crc kubenswrapper[4734]: I1205 23:31:19.743080 4734 generic.go:334] "Generic (PLEG): container finished" podID="44479d90-68b0-4428-b667-5c5c8bbebf2e" containerID="07476c5738162f39dd67c8daf88bb3862531eb61b994c1a2c21e39ad4d1f37bd" exitCode=0
Dec 05 23:31:19 crc kubenswrapper[4734]: I1205 23:31:19.743172 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5" event={"ID":"44479d90-68b0-4428-b667-5c5c8bbebf2e","Type":"ContainerDied","Data":"07476c5738162f39dd67c8daf88bb3862531eb61b994c1a2c21e39ad4d1f37bd"}
Dec 05 23:31:20 crc kubenswrapper[4734]: I1205 23:31:20.755128 4734 generic.go:334] "Generic (PLEG): container finished" podID="44479d90-68b0-4428-b667-5c5c8bbebf2e" containerID="4bb90c77d6455c9df7a1bff988bb3772fc133fa44c89dfb5e2057d5747fa3169" exitCode=0
Dec 05 23:31:20 crc kubenswrapper[4734]: I1205 23:31:20.755189 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5" event={"ID":"44479d90-68b0-4428-b667-5c5c8bbebf2e","Type":"ContainerDied","Data":"4bb90c77d6455c9df7a1bff988bb3772fc133fa44c89dfb5e2057d5747fa3169"}
Dec 05 23:31:22 crc kubenswrapper[4734]: I1205 23:31:22.060056 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5"
Dec 05 23:31:22 crc kubenswrapper[4734]: I1205 23:31:22.202997 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44479d90-68b0-4428-b667-5c5c8bbebf2e-bundle\") pod \"44479d90-68b0-4428-b667-5c5c8bbebf2e\" (UID: \"44479d90-68b0-4428-b667-5c5c8bbebf2e\") "
Dec 05 23:31:22 crc kubenswrapper[4734]: I1205 23:31:22.203114 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44479d90-68b0-4428-b667-5c5c8bbebf2e-util\") pod \"44479d90-68b0-4428-b667-5c5c8bbebf2e\" (UID: \"44479d90-68b0-4428-b667-5c5c8bbebf2e\") "
Dec 05 23:31:22 crc kubenswrapper[4734]: I1205 23:31:22.203203 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4dp2\" (UniqueName: \"kubernetes.io/projected/44479d90-68b0-4428-b667-5c5c8bbebf2e-kube-api-access-v4dp2\") pod \"44479d90-68b0-4428-b667-5c5c8bbebf2e\" (UID: \"44479d90-68b0-4428-b667-5c5c8bbebf2e\") "
Dec 05 23:31:22 crc kubenswrapper[4734]: I1205 23:31:22.204570 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44479d90-68b0-4428-b667-5c5c8bbebf2e-bundle" (OuterVolumeSpecName: "bundle") pod "44479d90-68b0-4428-b667-5c5c8bbebf2e" (UID: "44479d90-68b0-4428-b667-5c5c8bbebf2e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 23:31:22 crc kubenswrapper[4734]: I1205 23:31:22.213758 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44479d90-68b0-4428-b667-5c5c8bbebf2e-kube-api-access-v4dp2" (OuterVolumeSpecName: "kube-api-access-v4dp2") pod "44479d90-68b0-4428-b667-5c5c8bbebf2e" (UID: "44479d90-68b0-4428-b667-5c5c8bbebf2e"). InnerVolumeSpecName "kube-api-access-v4dp2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 23:31:22 crc kubenswrapper[4734]: I1205 23:31:22.223226 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44479d90-68b0-4428-b667-5c5c8bbebf2e-util" (OuterVolumeSpecName: "util") pod "44479d90-68b0-4428-b667-5c5c8bbebf2e" (UID: "44479d90-68b0-4428-b667-5c5c8bbebf2e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 23:31:22 crc kubenswrapper[4734]: I1205 23:31:22.305208 4734 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44479d90-68b0-4428-b667-5c5c8bbebf2e-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 23:31:22 crc kubenswrapper[4734]: I1205 23:31:22.305266 4734 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44479d90-68b0-4428-b667-5c5c8bbebf2e-util\") on node \"crc\" DevicePath \"\""
Dec 05 23:31:22 crc kubenswrapper[4734]: I1205 23:31:22.305286 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4dp2\" (UniqueName: \"kubernetes.io/projected/44479d90-68b0-4428-b667-5c5c8bbebf2e-kube-api-access-v4dp2\") on node \"crc\" DevicePath \"\""
Dec 05 23:31:22 crc kubenswrapper[4734]: I1205 23:31:22.774298 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5" event={"ID":"44479d90-68b0-4428-b667-5c5c8bbebf2e","Type":"ContainerDied","Data":"b08354fa387f28524f0489ea9b354803a5001670239a56cb78570f7a8a85ea41"}
Dec 05 23:31:22 crc kubenswrapper[4734]: I1205 23:31:22.774374 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5"
Dec 05 23:31:22 crc kubenswrapper[4734]: I1205 23:31:22.774384 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b08354fa387f28524f0489ea9b354803a5001670239a56cb78570f7a8a85ea41"
Dec 05 23:31:27 crc kubenswrapper[4734]: I1205 23:31:27.607320 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-gxn6v"]
Dec 05 23:31:27 crc kubenswrapper[4734]: E1205 23:31:27.608483 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44479d90-68b0-4428-b667-5c5c8bbebf2e" containerName="pull"
Dec 05 23:31:27 crc kubenswrapper[4734]: I1205 23:31:27.608503 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="44479d90-68b0-4428-b667-5c5c8bbebf2e" containerName="pull"
Dec 05 23:31:27 crc kubenswrapper[4734]: E1205 23:31:27.608541 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44479d90-68b0-4428-b667-5c5c8bbebf2e" containerName="util"
Dec 05 23:31:27 crc kubenswrapper[4734]: I1205 23:31:27.608548 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="44479d90-68b0-4428-b667-5c5c8bbebf2e" containerName="util"
Dec 05 23:31:27 crc kubenswrapper[4734]: E1205 23:31:27.608564 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44479d90-68b0-4428-b667-5c5c8bbebf2e" containerName="extract"
Dec 05 23:31:27 crc kubenswrapper[4734]: I1205 23:31:27.608571 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="44479d90-68b0-4428-b667-5c5c8bbebf2e" containerName="extract"
Dec 05 23:31:27 crc kubenswrapper[4734]: I1205 23:31:27.608704 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="44479d90-68b0-4428-b667-5c5c8bbebf2e" containerName="extract"
Dec 05 23:31:27 crc kubenswrapper[4734]: I1205 23:31:27.609273 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-gxn6v"
Dec 05 23:31:27 crc kubenswrapper[4734]: I1205 23:31:27.612601 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Dec 05 23:31:27 crc kubenswrapper[4734]: I1205 23:31:27.613008 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Dec 05 23:31:27 crc kubenswrapper[4734]: I1205 23:31:27.613198 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-hsjnn"
Dec 05 23:31:27 crc kubenswrapper[4734]: I1205 23:31:27.661235 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-gxn6v"]
Dec 05 23:31:27 crc kubenswrapper[4734]: I1205 23:31:27.686636 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cld4l\" (UniqueName: \"kubernetes.io/projected/7b52ad0f-5f7e-4691-be39-ac2f121bb909-kube-api-access-cld4l\") pod \"nmstate-operator-5b5b58f5c8-gxn6v\" (UID: \"7b52ad0f-5f7e-4691-be39-ac2f121bb909\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-gxn6v"
Dec 05 23:31:27 crc kubenswrapper[4734]: I1205 23:31:27.788915 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cld4l\" (UniqueName: \"kubernetes.io/projected/7b52ad0f-5f7e-4691-be39-ac2f121bb909-kube-api-access-cld4l\") pod \"nmstate-operator-5b5b58f5c8-gxn6v\" (UID: \"7b52ad0f-5f7e-4691-be39-ac2f121bb909\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-gxn6v"
Dec 05 23:31:27 crc kubenswrapper[4734]: I1205 23:31:27.828224 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cld4l\" (UniqueName: \"kubernetes.io/projected/7b52ad0f-5f7e-4691-be39-ac2f121bb909-kube-api-access-cld4l\") pod \"nmstate-operator-5b5b58f5c8-gxn6v\" (UID: \"7b52ad0f-5f7e-4691-be39-ac2f121bb909\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-gxn6v"
Dec 05 23:31:27 crc kubenswrapper[4734]: I1205 23:31:27.930311 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-gxn6v"
Dec 05 23:31:28 crc kubenswrapper[4734]: I1205 23:31:28.152015 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-gxn6v"]
Dec 05 23:31:28 crc kubenswrapper[4734]: I1205 23:31:28.807687 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-gxn6v" event={"ID":"7b52ad0f-5f7e-4691-be39-ac2f121bb909","Type":"ContainerStarted","Data":"0dc8ba1bd1aca80ecfcde09709687d33b95d2b4bc2c12b19f1e2ced4aecd7649"}
Dec 05 23:31:30 crc kubenswrapper[4734]: I1205 23:31:30.826282 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-gxn6v" event={"ID":"7b52ad0f-5f7e-4691-be39-ac2f121bb909","Type":"ContainerStarted","Data":"f3fe641b3cdc275609b252c4f11e82bd5102242951ebc670678b3d811de42f6d"}
Dec 05 23:31:30 crc kubenswrapper[4734]: I1205 23:31:30.849716 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-gxn6v" podStartSLOduration=1.744586733 podStartE2EDuration="3.849694844s" podCreationTimestamp="2025-12-05 23:31:27 +0000 UTC" firstStartedPulling="2025-12-05 23:31:28.160206903 +0000 UTC m=+708.843611179" lastFinishedPulling="2025-12-05 23:31:30.265315014 +0000 UTC m=+710.948719290" observedRunningTime="2025-12-05 23:31:30.846926137 +0000 UTC m=+711.530330423" watchObservedRunningTime="2025-12-05 23:31:30.849694844 +0000 UTC m=+711.533099120"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.556085 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-6p9xx"]
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.557866 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6p9xx"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.560097 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-4cjlv"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.568612 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-zxqpc"]
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.569555 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-zxqpc"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.573725 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.587182 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-x728q"]
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.588216 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-x728q"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.594988 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-zxqpc"]
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.641462 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-6p9xx"]
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.711919 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cr2t4"]
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.712770 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cr2t4"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.715028 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.716642 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-8r747"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.716855 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.718585 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ab1d36f1-0fc8-4ad6-8725-799c1838b033-dbus-socket\") pod \"nmstate-handler-x728q\" (UID: \"ab1d36f1-0fc8-4ad6-8725-799c1838b033\") " pod="openshift-nmstate/nmstate-handler-x728q"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.718631 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ab1d36f1-0fc8-4ad6-8725-799c1838b033-nmstate-lock\") pod \"nmstate-handler-x728q\" (UID: \"ab1d36f1-0fc8-4ad6-8725-799c1838b033\") " pod="openshift-nmstate/nmstate-handler-x728q"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.718686 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shwhn\" (UniqueName: \"kubernetes.io/projected/ab1d36f1-0fc8-4ad6-8725-799c1838b033-kube-api-access-shwhn\") pod \"nmstate-handler-x728q\" (UID: \"ab1d36f1-0fc8-4ad6-8725-799c1838b033\") " pod="openshift-nmstate/nmstate-handler-x728q"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.718717 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/6bf99a15-c582-4a10-a26f-252c1c870f55-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-zxqpc\" (UID: \"6bf99a15-c582-4a10-a26f-252c1c870f55\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-zxqpc"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.718746 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdlhf\" (UniqueName: \"kubernetes.io/projected/408e85a8-5bd9-4c30-bd55-5262e3a2aa24-kube-api-access-qdlhf\") pod \"nmstate-metrics-7f946cbc9-6p9xx\" (UID: \"408e85a8-5bd9-4c30-bd55-5262e3a2aa24\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6p9xx"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.718765 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swxtg\" (UniqueName: \"kubernetes.io/projected/6bf99a15-c582-4a10-a26f-252c1c870f55-kube-api-access-swxtg\") pod \"nmstate-webhook-5f6d4c5ccb-zxqpc\" (UID: \"6bf99a15-c582-4a10-a26f-252c1c870f55\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-zxqpc"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.719428 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ab1d36f1-0fc8-4ad6-8725-799c1838b033-ovs-socket\") pod \"nmstate-handler-x728q\" (UID: \"ab1d36f1-0fc8-4ad6-8725-799c1838b033\") " pod="openshift-nmstate/nmstate-handler-x728q"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.731318 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cr2t4"]
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.821128 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shwhn\" (UniqueName: \"kubernetes.io/projected/ab1d36f1-0fc8-4ad6-8725-799c1838b033-kube-api-access-shwhn\") pod \"nmstate-handler-x728q\" (UID: \"ab1d36f1-0fc8-4ad6-8725-799c1838b033\") " pod="openshift-nmstate/nmstate-handler-x728q"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.821205 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/238e4a30-5ad1-4948-b27f-41e096f3095a-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-cr2t4\" (UID: \"238e4a30-5ad1-4948-b27f-41e096f3095a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cr2t4"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.821236 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/6bf99a15-c582-4a10-a26f-252c1c870f55-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-zxqpc\" (UID: \"6bf99a15-c582-4a10-a26f-252c1c870f55\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-zxqpc"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.821261 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdlhf\" (UniqueName: \"kubernetes.io/projected/408e85a8-5bd9-4c30-bd55-5262e3a2aa24-kube-api-access-qdlhf\") pod \"nmstate-metrics-7f946cbc9-6p9xx\" (UID: \"408e85a8-5bd9-4c30-bd55-5262e3a2aa24\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6p9xx"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.821281 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swxtg\" (UniqueName: \"kubernetes.io/projected/6bf99a15-c582-4a10-a26f-252c1c870f55-kube-api-access-swxtg\") pod \"nmstate-webhook-5f6d4c5ccb-zxqpc\" (UID: \"6bf99a15-c582-4a10-a26f-252c1c870f55\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-zxqpc"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.821309 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ab1d36f1-0fc8-4ad6-8725-799c1838b033-ovs-socket\") pod \"nmstate-handler-x728q\" (UID: \"ab1d36f1-0fc8-4ad6-8725-799c1838b033\") " pod="openshift-nmstate/nmstate-handler-x728q"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.821334 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsz8s\" (UniqueName: \"kubernetes.io/projected/238e4a30-5ad1-4948-b27f-41e096f3095a-kube-api-access-lsz8s\") pod \"nmstate-console-plugin-7fbb5f6569-cr2t4\" (UID: \"238e4a30-5ad1-4948-b27f-41e096f3095a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cr2t4"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.821360 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/238e4a30-5ad1-4948-b27f-41e096f3095a-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-cr2t4\" (UID: \"238e4a30-5ad1-4948-b27f-41e096f3095a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cr2t4"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.821388 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ab1d36f1-0fc8-4ad6-8725-799c1838b033-dbus-socket\") pod \"nmstate-handler-x728q\" (UID: \"ab1d36f1-0fc8-4ad6-8725-799c1838b033\") " pod="openshift-nmstate/nmstate-handler-x728q"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.821404 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ab1d36f1-0fc8-4ad6-8725-799c1838b033-nmstate-lock\") pod \"nmstate-handler-x728q\" (UID: \"ab1d36f1-0fc8-4ad6-8725-799c1838b033\") " pod="openshift-nmstate/nmstate-handler-x728q"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.821505 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ab1d36f1-0fc8-4ad6-8725-799c1838b033-nmstate-lock\") pod \"nmstate-handler-x728q\" (UID: \"ab1d36f1-0fc8-4ad6-8725-799c1838b033\") " pod="openshift-nmstate/nmstate-handler-x728q"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.821591 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ab1d36f1-0fc8-4ad6-8725-799c1838b033-ovs-socket\") pod \"nmstate-handler-x728q\" (UID: \"ab1d36f1-0fc8-4ad6-8725-799c1838b033\") " pod="openshift-nmstate/nmstate-handler-x728q"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.821968 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ab1d36f1-0fc8-4ad6-8725-799c1838b033-dbus-socket\") pod \"nmstate-handler-x728q\" (UID: \"ab1d36f1-0fc8-4ad6-8725-799c1838b033\") " pod="openshift-nmstate/nmstate-handler-x728q"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.829742 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/6bf99a15-c582-4a10-a26f-252c1c870f55-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-zxqpc\" (UID: \"6bf99a15-c582-4a10-a26f-252c1c870f55\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-zxqpc"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.840049 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shwhn\" (UniqueName: \"kubernetes.io/projected/ab1d36f1-0fc8-4ad6-8725-799c1838b033-kube-api-access-shwhn\") pod \"nmstate-handler-x728q\" (UID: \"ab1d36f1-0fc8-4ad6-8725-799c1838b033\") " pod="openshift-nmstate/nmstate-handler-x728q"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.843735 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdlhf\" (UniqueName: \"kubernetes.io/projected/408e85a8-5bd9-4c30-bd55-5262e3a2aa24-kube-api-access-qdlhf\") pod \"nmstate-metrics-7f946cbc9-6p9xx\" (UID: \"408e85a8-5bd9-4c30-bd55-5262e3a2aa24\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6p9xx"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.847294 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swxtg\" (UniqueName: \"kubernetes.io/projected/6bf99a15-c582-4a10-a26f-252c1c870f55-kube-api-access-swxtg\") pod \"nmstate-webhook-5f6d4c5ccb-zxqpc\" (UID: \"6bf99a15-c582-4a10-a26f-252c1c870f55\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-zxqpc"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.882710 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6p9xx"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.894357 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-zxqpc"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.909207 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-x728q"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.922729 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/238e4a30-5ad1-4948-b27f-41e096f3095a-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-cr2t4\" (UID: \"238e4a30-5ad1-4948-b27f-41e096f3095a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cr2t4"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.922796 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsz8s\" (UniqueName: \"kubernetes.io/projected/238e4a30-5ad1-4948-b27f-41e096f3095a-kube-api-access-lsz8s\") pod \"nmstate-console-plugin-7fbb5f6569-cr2t4\" (UID: \"238e4a30-5ad1-4948-b27f-41e096f3095a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cr2t4"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.922829 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/238e4a30-5ad1-4948-b27f-41e096f3095a-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-cr2t4\" (UID: \"238e4a30-5ad1-4948-b27f-41e096f3095a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cr2t4"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.923883 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/238e4a30-5ad1-4948-b27f-41e096f3095a-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-cr2t4\" (UID: \"238e4a30-5ad1-4948-b27f-41e096f3095a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cr2t4"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.932539 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/238e4a30-5ad1-4948-b27f-41e096f3095a-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-cr2t4\" (UID: \"238e4a30-5ad1-4948-b27f-41e096f3095a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cr2t4"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.940973 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6f4549b8d9-j8tjt"]
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.941781 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f4549b8d9-j8tjt"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.959976 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsz8s\" (UniqueName: \"kubernetes.io/projected/238e4a30-5ad1-4948-b27f-41e096f3095a-kube-api-access-lsz8s\") pod \"nmstate-console-plugin-7fbb5f6569-cr2t4\" (UID: \"238e4a30-5ad1-4948-b27f-41e096f3095a\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cr2t4"
Dec 05 23:31:36 crc kubenswrapper[4734]: I1205 23:31:36.972916 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f4549b8d9-j8tjt"]
Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.025273 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zgk9\" (UniqueName: \"kubernetes.io/projected/7eb5a67c-c1a4-4294-bc90-b061efbffca9-kube-api-access-8zgk9\") pod \"console-6f4549b8d9-j8tjt\" (UID: \"7eb5a67c-c1a4-4294-bc90-b061efbffca9\") " pod="openshift-console/console-6f4549b8d9-j8tjt"
Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.025772 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7eb5a67c-c1a4-4294-bc90-b061efbffca9-oauth-serving-cert\") pod \"console-6f4549b8d9-j8tjt\" (UID: \"7eb5a67c-c1a4-4294-bc90-b061efbffca9\") " pod="openshift-console/console-6f4549b8d9-j8tjt"
Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.025836 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7eb5a67c-c1a4-4294-bc90-b061efbffca9-console-oauth-config\") pod \"console-6f4549b8d9-j8tjt\" (UID: \"7eb5a67c-c1a4-4294-bc90-b061efbffca9\") " pod="openshift-console/console-6f4549b8d9-j8tjt"
Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.025871 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7eb5a67c-c1a4-4294-bc90-b061efbffca9-console-serving-cert\") pod \"console-6f4549b8d9-j8tjt\" (UID: \"7eb5a67c-c1a4-4294-bc90-b061efbffca9\") " pod="openshift-console/console-6f4549b8d9-j8tjt"
Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.026191 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7eb5a67c-c1a4-4294-bc90-b061efbffca9-trusted-ca-bundle\") pod \"console-6f4549b8d9-j8tjt\" (UID: \"7eb5a67c-c1a4-4294-bc90-b061efbffca9\") " pod="openshift-console/console-6f4549b8d9-j8tjt"
Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.026252 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7eb5a67c-c1a4-4294-bc90-b061efbffca9-service-ca\") pod \"console-6f4549b8d9-j8tjt\" (UID: \"7eb5a67c-c1a4-4294-bc90-b061efbffca9\") " pod="openshift-console/console-6f4549b8d9-j8tjt"
Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.026452 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7eb5a67c-c1a4-4294-bc90-b061efbffca9-console-config\") pod \"console-6f4549b8d9-j8tjt\" (UID: \"7eb5a67c-c1a4-4294-bc90-b061efbffca9\") " pod="openshift-console/console-6f4549b8d9-j8tjt"
Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.027024 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cr2t4"
Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.127506 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7eb5a67c-c1a4-4294-bc90-b061efbffca9-console-serving-cert\") pod \"console-6f4549b8d9-j8tjt\" (UID: \"7eb5a67c-c1a4-4294-bc90-b061efbffca9\") " pod="openshift-console/console-6f4549b8d9-j8tjt"
Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.127585 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7eb5a67c-c1a4-4294-bc90-b061efbffca9-trusted-ca-bundle\") pod \"console-6f4549b8d9-j8tjt\" (UID: \"7eb5a67c-c1a4-4294-bc90-b061efbffca9\") " pod="openshift-console/console-6f4549b8d9-j8tjt"
Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.127608 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7eb5a67c-c1a4-4294-bc90-b061efbffca9-service-ca\") pod \"console-6f4549b8d9-j8tjt\" (UID: \"7eb5a67c-c1a4-4294-bc90-b061efbffca9\") " pod="openshift-console/console-6f4549b8d9-j8tjt"
Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.127636 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7eb5a67c-c1a4-4294-bc90-b061efbffca9-console-config\") pod \"console-6f4549b8d9-j8tjt\" (UID: \"7eb5a67c-c1a4-4294-bc90-b061efbffca9\") " pod="openshift-console/console-6f4549b8d9-j8tjt"
Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.127664 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zgk9\" (UniqueName: \"kubernetes.io/projected/7eb5a67c-c1a4-4294-bc90-b061efbffca9-kube-api-access-8zgk9\") pod \"console-6f4549b8d9-j8tjt\" (UID: \"7eb5a67c-c1a4-4294-bc90-b061efbffca9\") " pod="openshift-console/console-6f4549b8d9-j8tjt"
Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.127684 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7eb5a67c-c1a4-4294-bc90-b061efbffca9-oauth-serving-cert\") pod \"console-6f4549b8d9-j8tjt\" (UID: \"7eb5a67c-c1a4-4294-bc90-b061efbffca9\") " pod="openshift-console/console-6f4549b8d9-j8tjt"
Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.127723 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7eb5a67c-c1a4-4294-bc90-b061efbffca9-console-oauth-config\") pod \"console-6f4549b8d9-j8tjt\" (UID: \"7eb5a67c-c1a4-4294-bc90-b061efbffca9\") " pod="openshift-console/console-6f4549b8d9-j8tjt"
Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.128806 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7eb5a67c-c1a4-4294-bc90-b061efbffca9-service-ca\") pod \"console-6f4549b8d9-j8tjt\" (UID: \"7eb5a67c-c1a4-4294-bc90-b061efbffca9\") " pod="openshift-console/console-6f4549b8d9-j8tjt"
Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.129124 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7eb5a67c-c1a4-4294-bc90-b061efbffca9-console-config\") pod \"console-6f4549b8d9-j8tjt\" (UID: \"7eb5a67c-c1a4-4294-bc90-b061efbffca9\") " pod="openshift-console/console-6f4549b8d9-j8tjt"
Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.129841 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7eb5a67c-c1a4-4294-bc90-b061efbffca9-oauth-serving-cert\") pod \"console-6f4549b8d9-j8tjt\" (UID: \"7eb5a67c-c1a4-4294-bc90-b061efbffca9\") " pod="openshift-console/console-6f4549b8d9-j8tjt"
Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.130671 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7eb5a67c-c1a4-4294-bc90-b061efbffca9-trusted-ca-bundle\") pod \"console-6f4549b8d9-j8tjt\" (UID: \"7eb5a67c-c1a4-4294-bc90-b061efbffca9\") " pod="openshift-console/console-6f4549b8d9-j8tjt"
Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.133782 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7eb5a67c-c1a4-4294-bc90-b061efbffca9-console-oauth-config\") pod \"console-6f4549b8d9-j8tjt\" (UID: \"7eb5a67c-c1a4-4294-bc90-b061efbffca9\") " pod="openshift-console/console-6f4549b8d9-j8tjt"
Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.134108 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7eb5a67c-c1a4-4294-bc90-b061efbffca9-console-serving-cert\") pod \"console-6f4549b8d9-j8tjt\" (UID: \"7eb5a67c-c1a4-4294-bc90-b061efbffca9\") " pod="openshift-console/console-6f4549b8d9-j8tjt"
Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.146490 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zgk9\" (UniqueName: \"kubernetes.io/projected/7eb5a67c-c1a4-4294-bc90-b061efbffca9-kube-api-access-8zgk9\") pod \"console-6f4549b8d9-j8tjt\" (UID: \"7eb5a67c-c1a4-4294-bc90-b061efbffca9\") " pod="openshift-console/console-6f4549b8d9-j8tjt"
Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.214077 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-6p9xx"]
Dec 05 23:31:37 crc kubenswrapper[4734]: W1205 23:31:37.221906 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod408e85a8_5bd9_4c30_bd55_5262e3a2aa24.slice/crio-ccae070a442e5c9be25721ab8837f1193f6caab19fe4dd35e354cde4dfbe0363 WatchSource:0}: Error finding container ccae070a442e5c9be25721ab8837f1193f6caab19fe4dd35e354cde4dfbe0363: Status 404 returned error can't find the container with id ccae070a442e5c9be25721ab8837f1193f6caab19fe4dd35e354cde4dfbe0363
Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.252433 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-zxqpc"]
Dec 05 23:31:37 crc kubenswrapper[4734]: W1205 23:31:37.265465 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bf99a15_c582_4a10_a26f_252c1c870f55.slice/crio-088d3ac30391290dd060807ec61340bd92de702ee699ac5413537b4879b725dc WatchSource:0}: Error finding container 088d3ac30391290dd060807ec61340bd92de702ee699ac5413537b4879b725dc: Status 404 returned error can't find the container with id 088d3ac30391290dd060807ec61340bd92de702ee699ac5413537b4879b725dc
Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.288690 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f4549b8d9-j8tjt"
Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.313252 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cr2t4"]
Dec 05 23:31:37 crc kubenswrapper[4734]: W1205 23:31:37.320040 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod238e4a30_5ad1_4948_b27f_41e096f3095a.slice/crio-34f71fb7b658e436130cf8a40b2eddd18ffc821b56f0d4aa8f405c4fd7cd0f04 WatchSource:0}: Error finding container 34f71fb7b658e436130cf8a40b2eddd18ffc821b56f0d4aa8f405c4fd7cd0f04: Status 404 returned error can't find the container with id 34f71fb7b658e436130cf8a40b2eddd18ffc821b56f0d4aa8f405c4fd7cd0f04
Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.493779 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f4549b8d9-j8tjt"]
Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.874660 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f4549b8d9-j8tjt" event={"ID":"7eb5a67c-c1a4-4294-bc90-b061efbffca9","Type":"ContainerStarted","Data":"b3d56e958ffd531e7293f0ee8c067b7f75a7253e25892f2371dcd79cc97f6167"}
Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.874725 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f4549b8d9-j8tjt" event={"ID":"7eb5a67c-c1a4-4294-bc90-b061efbffca9","Type":"ContainerStarted","Data":"d011cab89b8f3d808363274bdf67724ac25b39fc62be59885d13408a50eb4ac1"}
Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.877688 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cr2t4" event={"ID":"238e4a30-5ad1-4948-b27f-41e096f3095a","Type":"ContainerStarted","Data":"34f71fb7b658e436130cf8a40b2eddd18ffc821b56f0d4aa8f405c4fd7cd0f04"}
Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.880061
4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6p9xx" event={"ID":"408e85a8-5bd9-4c30-bd55-5262e3a2aa24","Type":"ContainerStarted","Data":"ccae070a442e5c9be25721ab8837f1193f6caab19fe4dd35e354cde4dfbe0363"} Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.881881 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-x728q" event={"ID":"ab1d36f1-0fc8-4ad6-8725-799c1838b033","Type":"ContainerStarted","Data":"47582f3a745250ad7458d093b806ac617ea3d58da434c7b4b22672a26bcd23e1"} Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.883108 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-zxqpc" event={"ID":"6bf99a15-c582-4a10-a26f-252c1c870f55","Type":"ContainerStarted","Data":"088d3ac30391290dd060807ec61340bd92de702ee699ac5413537b4879b725dc"} Dec 05 23:31:37 crc kubenswrapper[4734]: I1205 23:31:37.904519 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6f4549b8d9-j8tjt" podStartSLOduration=1.9044899370000001 podStartE2EDuration="1.904489937s" podCreationTimestamp="2025-12-05 23:31:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:31:37.898485572 +0000 UTC m=+718.581889888" watchObservedRunningTime="2025-12-05 23:31:37.904489937 +0000 UTC m=+718.587894233" Dec 05 23:31:40 crc kubenswrapper[4734]: I1205 23:31:40.903012 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6p9xx" event={"ID":"408e85a8-5bd9-4c30-bd55-5262e3a2aa24","Type":"ContainerStarted","Data":"af77e14435ab1b9985608e99d0620d1b201f3ceeb561ee921e235ac5c6d31739"} Dec 05 23:31:40 crc kubenswrapper[4734]: I1205 23:31:40.904574 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-x728q" 
event={"ID":"ab1d36f1-0fc8-4ad6-8725-799c1838b033","Type":"ContainerStarted","Data":"0c88ba4c62fcc5a8251412b3db952dede1d28dea817d7579ee93afca44d77c99"} Dec 05 23:31:40 crc kubenswrapper[4734]: I1205 23:31:40.904741 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-x728q" Dec 05 23:31:40 crc kubenswrapper[4734]: I1205 23:31:40.906246 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-zxqpc" event={"ID":"6bf99a15-c582-4a10-a26f-252c1c870f55","Type":"ContainerStarted","Data":"93174ae680f50f92cd7a76fcd434b116ec4da46379936497f51d70597d79e41b"} Dec 05 23:31:40 crc kubenswrapper[4734]: I1205 23:31:40.906395 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-zxqpc" Dec 05 23:31:40 crc kubenswrapper[4734]: I1205 23:31:40.907953 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cr2t4" event={"ID":"238e4a30-5ad1-4948-b27f-41e096f3095a","Type":"ContainerStarted","Data":"8b43a4c160f4d7f584e2e3624ecdbbc23226c65a9971111f7d8cc7c15fb36e9b"} Dec 05 23:31:40 crc kubenswrapper[4734]: I1205 23:31:40.925554 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-x728q" podStartSLOduration=1.62073496 podStartE2EDuration="4.925517511s" podCreationTimestamp="2025-12-05 23:31:36 +0000 UTC" firstStartedPulling="2025-12-05 23:31:36.972968723 +0000 UTC m=+717.656372999" lastFinishedPulling="2025-12-05 23:31:40.277751274 +0000 UTC m=+720.961155550" observedRunningTime="2025-12-05 23:31:40.920966511 +0000 UTC m=+721.604370817" watchObservedRunningTime="2025-12-05 23:31:40.925517511 +0000 UTC m=+721.608921787" Dec 05 23:31:40 crc kubenswrapper[4734]: I1205 23:31:40.945903 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cr2t4" podStartSLOduration=2.014580341 podStartE2EDuration="4.945867674s" podCreationTimestamp="2025-12-05 23:31:36 +0000 UTC" firstStartedPulling="2025-12-05 23:31:37.323420254 +0000 UTC m=+718.006824520" lastFinishedPulling="2025-12-05 23:31:40.254707577 +0000 UTC m=+720.938111853" observedRunningTime="2025-12-05 23:31:40.943144578 +0000 UTC m=+721.626548854" watchObservedRunningTime="2025-12-05 23:31:40.945867674 +0000 UTC m=+721.629271950" Dec 05 23:31:42 crc kubenswrapper[4734]: I1205 23:31:42.923831 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6p9xx" event={"ID":"408e85a8-5bd9-4c30-bd55-5262e3a2aa24","Type":"ContainerStarted","Data":"e5e5cef34673e0b1eeb892919d6617d0563361bd994b48fc9d8edd6b19e04be5"} Dec 05 23:31:42 crc kubenswrapper[4734]: I1205 23:31:42.949047 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6p9xx" podStartSLOduration=1.6342267160000001 podStartE2EDuration="6.949019623s" podCreationTimestamp="2025-12-05 23:31:36 +0000 UTC" firstStartedPulling="2025-12-05 23:31:37.229270566 +0000 UTC m=+717.912674842" lastFinishedPulling="2025-12-05 23:31:42.544063483 +0000 UTC m=+723.227467749" observedRunningTime="2025-12-05 23:31:42.948070961 +0000 UTC m=+723.631475237" watchObservedRunningTime="2025-12-05 23:31:42.949019623 +0000 UTC m=+723.632423899" Dec 05 23:31:42 crc kubenswrapper[4734]: I1205 23:31:42.951076 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-zxqpc" podStartSLOduration=3.961825148 podStartE2EDuration="6.951060813s" podCreationTimestamp="2025-12-05 23:31:36 +0000 UTC" firstStartedPulling="2025-12-05 23:31:37.267560052 +0000 UTC m=+717.950964328" lastFinishedPulling="2025-12-05 23:31:40.256795727 +0000 UTC m=+720.940199993" observedRunningTime="2025-12-05 23:31:40.977322815 +0000 
UTC m=+721.660727081" watchObservedRunningTime="2025-12-05 23:31:42.951060813 +0000 UTC m=+723.634465089" Dec 05 23:31:46 crc kubenswrapper[4734]: I1205 23:31:46.935723 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-x728q" Dec 05 23:31:47 crc kubenswrapper[4734]: I1205 23:31:47.289284 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6f4549b8d9-j8tjt" Dec 05 23:31:47 crc kubenswrapper[4734]: I1205 23:31:47.289356 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6f4549b8d9-j8tjt" Dec 05 23:31:47 crc kubenswrapper[4734]: I1205 23:31:47.294454 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6f4549b8d9-j8tjt" Dec 05 23:31:47 crc kubenswrapper[4734]: I1205 23:31:47.971908 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6f4549b8d9-j8tjt" Dec 05 23:31:48 crc kubenswrapper[4734]: I1205 23:31:48.058960 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-5h9wr"] Dec 05 23:31:50 crc kubenswrapper[4734]: I1205 23:31:50.445334 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:31:50 crc kubenswrapper[4734]: I1205 23:31:50.446140 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:31:56 crc kubenswrapper[4734]: I1205 
23:31:56.904632 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-zxqpc" Dec 05 23:32:12 crc kubenswrapper[4734]: I1205 23:32:12.483009 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6"] Dec 05 23:32:12 crc kubenswrapper[4734]: I1205 23:32:12.484991 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6" Dec 05 23:32:12 crc kubenswrapper[4734]: I1205 23:32:12.492210 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6"] Dec 05 23:32:12 crc kubenswrapper[4734]: I1205 23:32:12.492474 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 05 23:32:12 crc kubenswrapper[4734]: I1205 23:32:12.585399 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13194382-29bc-40a1-8f25-9566b13ad6ae-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6\" (UID: \"13194382-29bc-40a1-8f25-9566b13ad6ae\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6" Dec 05 23:32:12 crc kubenswrapper[4734]: I1205 23:32:12.585486 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzxw8\" (UniqueName: \"kubernetes.io/projected/13194382-29bc-40a1-8f25-9566b13ad6ae-kube-api-access-wzxw8\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6\" (UID: \"13194382-29bc-40a1-8f25-9566b13ad6ae\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6" Dec 05 23:32:12 crc kubenswrapper[4734]: 
I1205 23:32:12.585586 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13194382-29bc-40a1-8f25-9566b13ad6ae-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6\" (UID: \"13194382-29bc-40a1-8f25-9566b13ad6ae\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6" Dec 05 23:32:12 crc kubenswrapper[4734]: I1205 23:32:12.686890 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzxw8\" (UniqueName: \"kubernetes.io/projected/13194382-29bc-40a1-8f25-9566b13ad6ae-kube-api-access-wzxw8\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6\" (UID: \"13194382-29bc-40a1-8f25-9566b13ad6ae\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6" Dec 05 23:32:12 crc kubenswrapper[4734]: I1205 23:32:12.686994 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13194382-29bc-40a1-8f25-9566b13ad6ae-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6\" (UID: \"13194382-29bc-40a1-8f25-9566b13ad6ae\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6" Dec 05 23:32:12 crc kubenswrapper[4734]: I1205 23:32:12.687053 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13194382-29bc-40a1-8f25-9566b13ad6ae-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6\" (UID: \"13194382-29bc-40a1-8f25-9566b13ad6ae\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6" Dec 05 23:32:12 crc kubenswrapper[4734]: I1205 23:32:12.687710 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/13194382-29bc-40a1-8f25-9566b13ad6ae-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6\" (UID: \"13194382-29bc-40a1-8f25-9566b13ad6ae\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6" Dec 05 23:32:12 crc kubenswrapper[4734]: I1205 23:32:12.687816 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13194382-29bc-40a1-8f25-9566b13ad6ae-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6\" (UID: \"13194382-29bc-40a1-8f25-9566b13ad6ae\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6" Dec 05 23:32:12 crc kubenswrapper[4734]: I1205 23:32:12.723833 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzxw8\" (UniqueName: \"kubernetes.io/projected/13194382-29bc-40a1-8f25-9566b13ad6ae-kube-api-access-wzxw8\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6\" (UID: \"13194382-29bc-40a1-8f25-9566b13ad6ae\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6" Dec 05 23:32:12 crc kubenswrapper[4734]: I1205 23:32:12.802907 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6" Dec 05 23:32:13 crc kubenswrapper[4734]: I1205 23:32:13.055798 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6"] Dec 05 23:32:13 crc kubenswrapper[4734]: I1205 23:32:13.104624 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-5h9wr" podUID="790e28b3-bfd6-40f2-8bd4-272fc91b9ffe" containerName="console" containerID="cri-o://a4a73def955379da1e0b0bb1e059e2e98a293893f07530194f7dd08dad90854a" gracePeriod=15 Dec 05 23:32:13 crc kubenswrapper[4734]: I1205 23:32:13.148923 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6" event={"ID":"13194382-29bc-40a1-8f25-9566b13ad6ae","Type":"ContainerStarted","Data":"fac441694b9f56b43d665ad2be89d14cba0e5e0bae7d8e2917a2e06a29ce0cc4"} Dec 05 23:32:13 crc kubenswrapper[4734]: I1205 23:32:13.635108 4734 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.030547 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-5h9wr_790e28b3-bfd6-40f2-8bd4-272fc91b9ffe/console/0.log" Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.031157 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-5h9wr" Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.113088 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-oauth-serving-cert\") pod \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\" (UID: \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\") " Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.113144 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-trusted-ca-bundle\") pod \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\" (UID: \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\") " Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.113204 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvcxp\" (UniqueName: \"kubernetes.io/projected/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-kube-api-access-qvcxp\") pod \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\" (UID: \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\") " Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.113239 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-service-ca\") pod \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\" (UID: \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\") " Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.113266 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-console-oauth-config\") pod \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\" (UID: \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\") " Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.113293 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-console-config\") pod \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\" (UID: \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\") " Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.113404 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-console-serving-cert\") pod \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\" (UID: \"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe\") " Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.114676 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "790e28b3-bfd6-40f2-8bd4-272fc91b9ffe" (UID: "790e28b3-bfd6-40f2-8bd4-272fc91b9ffe"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.114735 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "790e28b3-bfd6-40f2-8bd4-272fc91b9ffe" (UID: "790e28b3-bfd6-40f2-8bd4-272fc91b9ffe"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.114876 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-service-ca" (OuterVolumeSpecName: "service-ca") pod "790e28b3-bfd6-40f2-8bd4-272fc91b9ffe" (UID: "790e28b3-bfd6-40f2-8bd4-272fc91b9ffe"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.115838 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-console-config" (OuterVolumeSpecName: "console-config") pod "790e28b3-bfd6-40f2-8bd4-272fc91b9ffe" (UID: "790e28b3-bfd6-40f2-8bd4-272fc91b9ffe"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.121586 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-kube-api-access-qvcxp" (OuterVolumeSpecName: "kube-api-access-qvcxp") pod "790e28b3-bfd6-40f2-8bd4-272fc91b9ffe" (UID: "790e28b3-bfd6-40f2-8bd4-272fc91b9ffe"). InnerVolumeSpecName "kube-api-access-qvcxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.121986 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "790e28b3-bfd6-40f2-8bd4-272fc91b9ffe" (UID: "790e28b3-bfd6-40f2-8bd4-272fc91b9ffe"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.123498 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "790e28b3-bfd6-40f2-8bd4-272fc91b9ffe" (UID: "790e28b3-bfd6-40f2-8bd4-272fc91b9ffe"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.157595 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-5h9wr_790e28b3-bfd6-40f2-8bd4-272fc91b9ffe/console/0.log" Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.157662 4734 generic.go:334] "Generic (PLEG): container finished" podID="790e28b3-bfd6-40f2-8bd4-272fc91b9ffe" containerID="a4a73def955379da1e0b0bb1e059e2e98a293893f07530194f7dd08dad90854a" exitCode=2 Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.157772 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-5h9wr" Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.157779 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-5h9wr" event={"ID":"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe","Type":"ContainerDied","Data":"a4a73def955379da1e0b0bb1e059e2e98a293893f07530194f7dd08dad90854a"} Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.157896 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-5h9wr" event={"ID":"790e28b3-bfd6-40f2-8bd4-272fc91b9ffe","Type":"ContainerDied","Data":"396ccce6799b68b38f3167cbd02e540c50f7f54cca577b1a2f46a61791f58ef7"} Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.157939 4734 scope.go:117] "RemoveContainer" containerID="a4a73def955379da1e0b0bb1e059e2e98a293893f07530194f7dd08dad90854a" Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.160060 4734 generic.go:334] "Generic (PLEG): container finished" podID="13194382-29bc-40a1-8f25-9566b13ad6ae" containerID="079e3887bf7d73b71ffe34b76d57a54decac8686018304c3d49507c98f5be464" exitCode=0 Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.160106 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6" 
event={"ID":"13194382-29bc-40a1-8f25-9566b13ad6ae","Type":"ContainerDied","Data":"079e3887bf7d73b71ffe34b76d57a54decac8686018304c3d49507c98f5be464"} Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.193896 4734 scope.go:117] "RemoveContainer" containerID="a4a73def955379da1e0b0bb1e059e2e98a293893f07530194f7dd08dad90854a" Dec 05 23:32:14 crc kubenswrapper[4734]: E1205 23:32:14.194947 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4a73def955379da1e0b0bb1e059e2e98a293893f07530194f7dd08dad90854a\": container with ID starting with a4a73def955379da1e0b0bb1e059e2e98a293893f07530194f7dd08dad90854a not found: ID does not exist" containerID="a4a73def955379da1e0b0bb1e059e2e98a293893f07530194f7dd08dad90854a" Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.195021 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4a73def955379da1e0b0bb1e059e2e98a293893f07530194f7dd08dad90854a"} err="failed to get container status \"a4a73def955379da1e0b0bb1e059e2e98a293893f07530194f7dd08dad90854a\": rpc error: code = NotFound desc = could not find container \"a4a73def955379da1e0b0bb1e059e2e98a293893f07530194f7dd08dad90854a\": container with ID starting with a4a73def955379da1e0b0bb1e059e2e98a293893f07530194f7dd08dad90854a not found: ID does not exist" Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.212985 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-5h9wr"] Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.215213 4734 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.215239 4734 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.215250 4734 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.215259 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvcxp\" (UniqueName: \"kubernetes.io/projected/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-kube-api-access-qvcxp\") on node \"crc\" DevicePath \"\"" Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.215460 4734 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.215479 4734 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.215488 4734 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe-console-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.218194 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-5h9wr"] Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.841197 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cz6xv"] Dec 05 23:32:14 crc kubenswrapper[4734]: E1205 23:32:14.842054 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="790e28b3-bfd6-40f2-8bd4-272fc91b9ffe" 
containerName="console" Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.842073 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="790e28b3-bfd6-40f2-8bd4-272fc91b9ffe" containerName="console" Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.842224 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="790e28b3-bfd6-40f2-8bd4-272fc91b9ffe" containerName="console" Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.843672 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cz6xv" Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.861503 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cz6xv"] Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.928417 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9g9b\" (UniqueName: \"kubernetes.io/projected/40e54803-9b8a-48ff-a4be-9cd955d7031d-kube-api-access-z9g9b\") pod \"redhat-operators-cz6xv\" (UID: \"40e54803-9b8a-48ff-a4be-9cd955d7031d\") " pod="openshift-marketplace/redhat-operators-cz6xv" Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.928480 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40e54803-9b8a-48ff-a4be-9cd955d7031d-catalog-content\") pod \"redhat-operators-cz6xv\" (UID: \"40e54803-9b8a-48ff-a4be-9cd955d7031d\") " pod="openshift-marketplace/redhat-operators-cz6xv" Dec 05 23:32:14 crc kubenswrapper[4734]: I1205 23:32:14.928566 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40e54803-9b8a-48ff-a4be-9cd955d7031d-utilities\") pod \"redhat-operators-cz6xv\" (UID: \"40e54803-9b8a-48ff-a4be-9cd955d7031d\") " pod="openshift-marketplace/redhat-operators-cz6xv" Dec 05 
23:32:15 crc kubenswrapper[4734]: I1205 23:32:15.029941 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9g9b\" (UniqueName: \"kubernetes.io/projected/40e54803-9b8a-48ff-a4be-9cd955d7031d-kube-api-access-z9g9b\") pod \"redhat-operators-cz6xv\" (UID: \"40e54803-9b8a-48ff-a4be-9cd955d7031d\") " pod="openshift-marketplace/redhat-operators-cz6xv" Dec 05 23:32:15 crc kubenswrapper[4734]: I1205 23:32:15.030013 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40e54803-9b8a-48ff-a4be-9cd955d7031d-catalog-content\") pod \"redhat-operators-cz6xv\" (UID: \"40e54803-9b8a-48ff-a4be-9cd955d7031d\") " pod="openshift-marketplace/redhat-operators-cz6xv" Dec 05 23:32:15 crc kubenswrapper[4734]: I1205 23:32:15.030076 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40e54803-9b8a-48ff-a4be-9cd955d7031d-utilities\") pod \"redhat-operators-cz6xv\" (UID: \"40e54803-9b8a-48ff-a4be-9cd955d7031d\") " pod="openshift-marketplace/redhat-operators-cz6xv" Dec 05 23:32:15 crc kubenswrapper[4734]: I1205 23:32:15.030712 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40e54803-9b8a-48ff-a4be-9cd955d7031d-catalog-content\") pod \"redhat-operators-cz6xv\" (UID: \"40e54803-9b8a-48ff-a4be-9cd955d7031d\") " pod="openshift-marketplace/redhat-operators-cz6xv" Dec 05 23:32:15 crc kubenswrapper[4734]: I1205 23:32:15.050557 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9g9b\" (UniqueName: \"kubernetes.io/projected/40e54803-9b8a-48ff-a4be-9cd955d7031d-kube-api-access-z9g9b\") pod \"redhat-operators-cz6xv\" (UID: \"40e54803-9b8a-48ff-a4be-9cd955d7031d\") " pod="openshift-marketplace/redhat-operators-cz6xv" Dec 05 23:32:15 crc kubenswrapper[4734]: I1205 
23:32:15.625000 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="790e28b3-bfd6-40f2-8bd4-272fc91b9ffe" path="/var/lib/kubelet/pods/790e28b3-bfd6-40f2-8bd4-272fc91b9ffe/volumes" Dec 05 23:32:15 crc kubenswrapper[4734]: I1205 23:32:15.946114 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40e54803-9b8a-48ff-a4be-9cd955d7031d-utilities\") pod \"redhat-operators-cz6xv\" (UID: \"40e54803-9b8a-48ff-a4be-9cd955d7031d\") " pod="openshift-marketplace/redhat-operators-cz6xv" Dec 05 23:32:16 crc kubenswrapper[4734]: I1205 23:32:16.081162 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cz6xv" Dec 05 23:32:16 crc kubenswrapper[4734]: I1205 23:32:16.509994 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cz6xv"] Dec 05 23:32:16 crc kubenswrapper[4734]: W1205 23:32:16.510746 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40e54803_9b8a_48ff_a4be_9cd955d7031d.slice/crio-262b2532b7bd99b1831bcd71ed2480db0ba303a9155cfb30b8f3a5315818e6ec WatchSource:0}: Error finding container 262b2532b7bd99b1831bcd71ed2480db0ba303a9155cfb30b8f3a5315818e6ec: Status 404 returned error can't find the container with id 262b2532b7bd99b1831bcd71ed2480db0ba303a9155cfb30b8f3a5315818e6ec Dec 05 23:32:17 crc kubenswrapper[4734]: I1205 23:32:17.194039 4734 generic.go:334] "Generic (PLEG): container finished" podID="13194382-29bc-40a1-8f25-9566b13ad6ae" containerID="6af59b8fb1610f4019c2694648b791e28c0e155f627937a06f40c3ce4f5d6683" exitCode=0 Dec 05 23:32:17 crc kubenswrapper[4734]: I1205 23:32:17.194119 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6" 
event={"ID":"13194382-29bc-40a1-8f25-9566b13ad6ae","Type":"ContainerDied","Data":"6af59b8fb1610f4019c2694648b791e28c0e155f627937a06f40c3ce4f5d6683"} Dec 05 23:32:17 crc kubenswrapper[4734]: I1205 23:32:17.198739 4734 generic.go:334] "Generic (PLEG): container finished" podID="40e54803-9b8a-48ff-a4be-9cd955d7031d" containerID="7761a635c251df514e1069efa816474532cc038fa776b36a774f2e1db0a96c0d" exitCode=0 Dec 05 23:32:17 crc kubenswrapper[4734]: I1205 23:32:17.198797 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cz6xv" event={"ID":"40e54803-9b8a-48ff-a4be-9cd955d7031d","Type":"ContainerDied","Data":"7761a635c251df514e1069efa816474532cc038fa776b36a774f2e1db0a96c0d"} Dec 05 23:32:17 crc kubenswrapper[4734]: I1205 23:32:17.198831 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cz6xv" event={"ID":"40e54803-9b8a-48ff-a4be-9cd955d7031d","Type":"ContainerStarted","Data":"262b2532b7bd99b1831bcd71ed2480db0ba303a9155cfb30b8f3a5315818e6ec"} Dec 05 23:32:18 crc kubenswrapper[4734]: I1205 23:32:18.208704 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cz6xv" event={"ID":"40e54803-9b8a-48ff-a4be-9cd955d7031d","Type":"ContainerStarted","Data":"32a732f6e6dffd260ae95ad8156db2498a5f94d6a26d4f5c5f4bd88246aad058"} Dec 05 23:32:18 crc kubenswrapper[4734]: I1205 23:32:18.214326 4734 generic.go:334] "Generic (PLEG): container finished" podID="13194382-29bc-40a1-8f25-9566b13ad6ae" containerID="e300921ad07a2a9caabef5811908f26497bf9c08dd88f82df54fabdd25937cfb" exitCode=0 Dec 05 23:32:18 crc kubenswrapper[4734]: I1205 23:32:18.214402 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6" event={"ID":"13194382-29bc-40a1-8f25-9566b13ad6ae","Type":"ContainerDied","Data":"e300921ad07a2a9caabef5811908f26497bf9c08dd88f82df54fabdd25937cfb"} Dec 05 23:32:19 crc 
kubenswrapper[4734]: I1205 23:32:19.505172 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6" Dec 05 23:32:19 crc kubenswrapper[4734]: I1205 23:32:19.599997 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13194382-29bc-40a1-8f25-9566b13ad6ae-util\") pod \"13194382-29bc-40a1-8f25-9566b13ad6ae\" (UID: \"13194382-29bc-40a1-8f25-9566b13ad6ae\") " Dec 05 23:32:19 crc kubenswrapper[4734]: I1205 23:32:19.600440 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13194382-29bc-40a1-8f25-9566b13ad6ae-bundle\") pod \"13194382-29bc-40a1-8f25-9566b13ad6ae\" (UID: \"13194382-29bc-40a1-8f25-9566b13ad6ae\") " Dec 05 23:32:19 crc kubenswrapper[4734]: I1205 23:32:19.600482 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzxw8\" (UniqueName: \"kubernetes.io/projected/13194382-29bc-40a1-8f25-9566b13ad6ae-kube-api-access-wzxw8\") pod \"13194382-29bc-40a1-8f25-9566b13ad6ae\" (UID: \"13194382-29bc-40a1-8f25-9566b13ad6ae\") " Dec 05 23:32:19 crc kubenswrapper[4734]: I1205 23:32:19.601993 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13194382-29bc-40a1-8f25-9566b13ad6ae-bundle" (OuterVolumeSpecName: "bundle") pod "13194382-29bc-40a1-8f25-9566b13ad6ae" (UID: "13194382-29bc-40a1-8f25-9566b13ad6ae"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:32:19 crc kubenswrapper[4734]: I1205 23:32:19.619309 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13194382-29bc-40a1-8f25-9566b13ad6ae-kube-api-access-wzxw8" (OuterVolumeSpecName: "kube-api-access-wzxw8") pod "13194382-29bc-40a1-8f25-9566b13ad6ae" (UID: "13194382-29bc-40a1-8f25-9566b13ad6ae"). InnerVolumeSpecName "kube-api-access-wzxw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:32:19 crc kubenswrapper[4734]: I1205 23:32:19.628477 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13194382-29bc-40a1-8f25-9566b13ad6ae-util" (OuterVolumeSpecName: "util") pod "13194382-29bc-40a1-8f25-9566b13ad6ae" (UID: "13194382-29bc-40a1-8f25-9566b13ad6ae"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:32:19 crc kubenswrapper[4734]: I1205 23:32:19.702567 4734 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13194382-29bc-40a1-8f25-9566b13ad6ae-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:32:19 crc kubenswrapper[4734]: I1205 23:32:19.702599 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzxw8\" (UniqueName: \"kubernetes.io/projected/13194382-29bc-40a1-8f25-9566b13ad6ae-kube-api-access-wzxw8\") on node \"crc\" DevicePath \"\"" Dec 05 23:32:19 crc kubenswrapper[4734]: I1205 23:32:19.702611 4734 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13194382-29bc-40a1-8f25-9566b13ad6ae-util\") on node \"crc\" DevicePath \"\"" Dec 05 23:32:20 crc kubenswrapper[4734]: I1205 23:32:20.230102 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6" 
event={"ID":"13194382-29bc-40a1-8f25-9566b13ad6ae","Type":"ContainerDied","Data":"fac441694b9f56b43d665ad2be89d14cba0e5e0bae7d8e2917a2e06a29ce0cc4"} Dec 05 23:32:20 crc kubenswrapper[4734]: I1205 23:32:20.230150 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fac441694b9f56b43d665ad2be89d14cba0e5e0bae7d8e2917a2e06a29ce0cc4" Dec 05 23:32:20 crc kubenswrapper[4734]: I1205 23:32:20.230233 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6" Dec 05 23:32:20 crc kubenswrapper[4734]: I1205 23:32:20.235549 4734 generic.go:334] "Generic (PLEG): container finished" podID="40e54803-9b8a-48ff-a4be-9cd955d7031d" containerID="32a732f6e6dffd260ae95ad8156db2498a5f94d6a26d4f5c5f4bd88246aad058" exitCode=0 Dec 05 23:32:20 crc kubenswrapper[4734]: I1205 23:32:20.235618 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cz6xv" event={"ID":"40e54803-9b8a-48ff-a4be-9cd955d7031d","Type":"ContainerDied","Data":"32a732f6e6dffd260ae95ad8156db2498a5f94d6a26d4f5c5f4bd88246aad058"} Dec 05 23:32:20 crc kubenswrapper[4734]: I1205 23:32:20.445679 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:32:20 crc kubenswrapper[4734]: I1205 23:32:20.445752 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:32:21 crc kubenswrapper[4734]: I1205 23:32:21.245277 4734 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cz6xv" event={"ID":"40e54803-9b8a-48ff-a4be-9cd955d7031d","Type":"ContainerStarted","Data":"c3cf9713e62176229e38bcf0196f7deacc663626f817534b6a4db864f0114d93"} Dec 05 23:32:21 crc kubenswrapper[4734]: I1205 23:32:21.267491 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cz6xv" podStartSLOduration=3.696263773 podStartE2EDuration="7.267460442s" podCreationTimestamp="2025-12-05 23:32:14 +0000 UTC" firstStartedPulling="2025-12-05 23:32:17.203100748 +0000 UTC m=+757.886505064" lastFinishedPulling="2025-12-05 23:32:20.774297457 +0000 UTC m=+761.457701733" observedRunningTime="2025-12-05 23:32:21.263252001 +0000 UTC m=+761.946656297" watchObservedRunningTime="2025-12-05 23:32:21.267460442 +0000 UTC m=+761.950864718" Dec 05 23:32:26 crc kubenswrapper[4734]: I1205 23:32:26.082369 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cz6xv" Dec 05 23:32:26 crc kubenswrapper[4734]: I1205 23:32:26.082741 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cz6xv" Dec 05 23:32:27 crc kubenswrapper[4734]: I1205 23:32:27.133551 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cz6xv" podUID="40e54803-9b8a-48ff-a4be-9cd955d7031d" containerName="registry-server" probeResult="failure" output=< Dec 05 23:32:27 crc kubenswrapper[4734]: timeout: failed to connect service ":50051" within 1s Dec 05 23:32:27 crc kubenswrapper[4734]: > Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.476029 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5b69785d4f-gksx6"] Dec 05 23:32:29 crc kubenswrapper[4734]: E1205 23:32:29.476854 4734 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="13194382-29bc-40a1-8f25-9566b13ad6ae" containerName="util" Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.476873 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="13194382-29bc-40a1-8f25-9566b13ad6ae" containerName="util" Dec 05 23:32:29 crc kubenswrapper[4734]: E1205 23:32:29.476889 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13194382-29bc-40a1-8f25-9566b13ad6ae" containerName="extract" Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.476897 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="13194382-29bc-40a1-8f25-9566b13ad6ae" containerName="extract" Dec 05 23:32:29 crc kubenswrapper[4734]: E1205 23:32:29.476909 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13194382-29bc-40a1-8f25-9566b13ad6ae" containerName="pull" Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.476917 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="13194382-29bc-40a1-8f25-9566b13ad6ae" containerName="pull" Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.477051 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="13194382-29bc-40a1-8f25-9566b13ad6ae" containerName="extract" Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.482092 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5b69785d4f-gksx6" Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.486057 4734 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.486193 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.486073 4734 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.486277 4734 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-h4l5z" Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.489802 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.500701 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5b69785d4f-gksx6"] Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.554358 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/df912953-69c4-4841-abb5-afa544bd8df7-apiservice-cert\") pod \"metallb-operator-controller-manager-5b69785d4f-gksx6\" (UID: \"df912953-69c4-4841-abb5-afa544bd8df7\") " pod="metallb-system/metallb-operator-controller-manager-5b69785d4f-gksx6" Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.554870 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjcrk\" (UniqueName: \"kubernetes.io/projected/df912953-69c4-4841-abb5-afa544bd8df7-kube-api-access-wjcrk\") pod 
\"metallb-operator-controller-manager-5b69785d4f-gksx6\" (UID: \"df912953-69c4-4841-abb5-afa544bd8df7\") " pod="metallb-system/metallb-operator-controller-manager-5b69785d4f-gksx6" Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.555011 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/df912953-69c4-4841-abb5-afa544bd8df7-webhook-cert\") pod \"metallb-operator-controller-manager-5b69785d4f-gksx6\" (UID: \"df912953-69c4-4841-abb5-afa544bd8df7\") " pod="metallb-system/metallb-operator-controller-manager-5b69785d4f-gksx6" Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.658675 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjcrk\" (UniqueName: \"kubernetes.io/projected/df912953-69c4-4841-abb5-afa544bd8df7-kube-api-access-wjcrk\") pod \"metallb-operator-controller-manager-5b69785d4f-gksx6\" (UID: \"df912953-69c4-4841-abb5-afa544bd8df7\") " pod="metallb-system/metallb-operator-controller-manager-5b69785d4f-gksx6" Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.659006 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/df912953-69c4-4841-abb5-afa544bd8df7-webhook-cert\") pod \"metallb-operator-controller-manager-5b69785d4f-gksx6\" (UID: \"df912953-69c4-4841-abb5-afa544bd8df7\") " pod="metallb-system/metallb-operator-controller-manager-5b69785d4f-gksx6" Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.659173 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/df912953-69c4-4841-abb5-afa544bd8df7-apiservice-cert\") pod \"metallb-operator-controller-manager-5b69785d4f-gksx6\" (UID: \"df912953-69c4-4841-abb5-afa544bd8df7\") " pod="metallb-system/metallb-operator-controller-manager-5b69785d4f-gksx6" Dec 05 23:32:29 crc 
kubenswrapper[4734]: I1205 23:32:29.670060 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/df912953-69c4-4841-abb5-afa544bd8df7-webhook-cert\") pod \"metallb-operator-controller-manager-5b69785d4f-gksx6\" (UID: \"df912953-69c4-4841-abb5-afa544bd8df7\") " pod="metallb-system/metallb-operator-controller-manager-5b69785d4f-gksx6" Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.683978 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjcrk\" (UniqueName: \"kubernetes.io/projected/df912953-69c4-4841-abb5-afa544bd8df7-kube-api-access-wjcrk\") pod \"metallb-operator-controller-manager-5b69785d4f-gksx6\" (UID: \"df912953-69c4-4841-abb5-afa544bd8df7\") " pod="metallb-system/metallb-operator-controller-manager-5b69785d4f-gksx6" Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.684200 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/df912953-69c4-4841-abb5-afa544bd8df7-apiservice-cert\") pod \"metallb-operator-controller-manager-5b69785d4f-gksx6\" (UID: \"df912953-69c4-4841-abb5-afa544bd8df7\") " pod="metallb-system/metallb-operator-controller-manager-5b69785d4f-gksx6" Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.804759 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-8688474b6d-2dhr7"] Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.805693 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-8688474b6d-2dhr7" Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.809511 4734 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.810394 4734 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.812138 4734 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-bdm96" Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.814870 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5b69785d4f-gksx6" Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.832278 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-8688474b6d-2dhr7"] Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.864093 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw2lc\" (UniqueName: \"kubernetes.io/projected/97e5de92-85a3-4262-a82f-5b7195d72a9c-kube-api-access-xw2lc\") pod \"metallb-operator-webhook-server-8688474b6d-2dhr7\" (UID: \"97e5de92-85a3-4262-a82f-5b7195d72a9c\") " pod="metallb-system/metallb-operator-webhook-server-8688474b6d-2dhr7" Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.864162 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97e5de92-85a3-4262-a82f-5b7195d72a9c-webhook-cert\") pod \"metallb-operator-webhook-server-8688474b6d-2dhr7\" (UID: \"97e5de92-85a3-4262-a82f-5b7195d72a9c\") " pod="metallb-system/metallb-operator-webhook-server-8688474b6d-2dhr7" Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 
23:32:29.864190 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97e5de92-85a3-4262-a82f-5b7195d72a9c-apiservice-cert\") pod \"metallb-operator-webhook-server-8688474b6d-2dhr7\" (UID: \"97e5de92-85a3-4262-a82f-5b7195d72a9c\") " pod="metallb-system/metallb-operator-webhook-server-8688474b6d-2dhr7" Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.967676 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw2lc\" (UniqueName: \"kubernetes.io/projected/97e5de92-85a3-4262-a82f-5b7195d72a9c-kube-api-access-xw2lc\") pod \"metallb-operator-webhook-server-8688474b6d-2dhr7\" (UID: \"97e5de92-85a3-4262-a82f-5b7195d72a9c\") " pod="metallb-system/metallb-operator-webhook-server-8688474b6d-2dhr7" Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.968124 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97e5de92-85a3-4262-a82f-5b7195d72a9c-webhook-cert\") pod \"metallb-operator-webhook-server-8688474b6d-2dhr7\" (UID: \"97e5de92-85a3-4262-a82f-5b7195d72a9c\") " pod="metallb-system/metallb-operator-webhook-server-8688474b6d-2dhr7" Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.968152 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97e5de92-85a3-4262-a82f-5b7195d72a9c-apiservice-cert\") pod \"metallb-operator-webhook-server-8688474b6d-2dhr7\" (UID: \"97e5de92-85a3-4262-a82f-5b7195d72a9c\") " pod="metallb-system/metallb-operator-webhook-server-8688474b6d-2dhr7" Dec 05 23:32:29 crc kubenswrapper[4734]: I1205 23:32:29.997575 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97e5de92-85a3-4262-a82f-5b7195d72a9c-webhook-cert\") pod 
\"metallb-operator-webhook-server-8688474b6d-2dhr7\" (UID: \"97e5de92-85a3-4262-a82f-5b7195d72a9c\") " pod="metallb-system/metallb-operator-webhook-server-8688474b6d-2dhr7" Dec 05 23:32:30 crc kubenswrapper[4734]: I1205 23:32:30.003325 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97e5de92-85a3-4262-a82f-5b7195d72a9c-apiservice-cert\") pod \"metallb-operator-webhook-server-8688474b6d-2dhr7\" (UID: \"97e5de92-85a3-4262-a82f-5b7195d72a9c\") " pod="metallb-system/metallb-operator-webhook-server-8688474b6d-2dhr7" Dec 05 23:32:30 crc kubenswrapper[4734]: I1205 23:32:30.037559 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw2lc\" (UniqueName: \"kubernetes.io/projected/97e5de92-85a3-4262-a82f-5b7195d72a9c-kube-api-access-xw2lc\") pod \"metallb-operator-webhook-server-8688474b6d-2dhr7\" (UID: \"97e5de92-85a3-4262-a82f-5b7195d72a9c\") " pod="metallb-system/metallb-operator-webhook-server-8688474b6d-2dhr7" Dec 05 23:32:30 crc kubenswrapper[4734]: I1205 23:32:30.131495 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-8688474b6d-2dhr7" Dec 05 23:32:30 crc kubenswrapper[4734]: I1205 23:32:30.236860 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5b69785d4f-gksx6"] Dec 05 23:32:30 crc kubenswrapper[4734]: W1205 23:32:30.243322 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf912953_69c4_4841_abb5_afa544bd8df7.slice/crio-e7ab57f1787627dd7932b623c9fd281c4ce9c01ce20c11e282c25570b4b18096 WatchSource:0}: Error finding container e7ab57f1787627dd7932b623c9fd281c4ce9c01ce20c11e282c25570b4b18096: Status 404 returned error can't find the container with id e7ab57f1787627dd7932b623c9fd281c4ce9c01ce20c11e282c25570b4b18096 Dec 05 23:32:30 crc kubenswrapper[4734]: I1205 23:32:30.301744 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5b69785d4f-gksx6" event={"ID":"df912953-69c4-4841-abb5-afa544bd8df7","Type":"ContainerStarted","Data":"e7ab57f1787627dd7932b623c9fd281c4ce9c01ce20c11e282c25570b4b18096"} Dec 05 23:32:30 crc kubenswrapper[4734]: I1205 23:32:30.410246 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-8688474b6d-2dhr7"] Dec 05 23:32:31 crc kubenswrapper[4734]: I1205 23:32:31.308998 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-8688474b6d-2dhr7" event={"ID":"97e5de92-85a3-4262-a82f-5b7195d72a9c","Type":"ContainerStarted","Data":"25d6ec799d63316167e8015a44e3d056c7fef0bad7b2fcbeb35ce9e78ba2aee1"} Dec 05 23:32:36 crc kubenswrapper[4734]: I1205 23:32:36.165188 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cz6xv" Dec 05 23:32:36 crc kubenswrapper[4734]: I1205 23:32:36.222735 4734 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cz6xv" Dec 05 23:32:38 crc kubenswrapper[4734]: I1205 23:32:38.028054 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cz6xv"] Dec 05 23:32:38 crc kubenswrapper[4734]: I1205 23:32:38.028762 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cz6xv" podUID="40e54803-9b8a-48ff-a4be-9cd955d7031d" containerName="registry-server" containerID="cri-o://c3cf9713e62176229e38bcf0196f7deacc663626f817534b6a4db864f0114d93" gracePeriod=2 Dec 05 23:32:38 crc kubenswrapper[4734]: I1205 23:32:38.370588 4734 generic.go:334] "Generic (PLEG): container finished" podID="40e54803-9b8a-48ff-a4be-9cd955d7031d" containerID="c3cf9713e62176229e38bcf0196f7deacc663626f817534b6a4db864f0114d93" exitCode=0 Dec 05 23:32:38 crc kubenswrapper[4734]: I1205 23:32:38.370660 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cz6xv" event={"ID":"40e54803-9b8a-48ff-a4be-9cd955d7031d","Type":"ContainerDied","Data":"c3cf9713e62176229e38bcf0196f7deacc663626f817534b6a4db864f0114d93"} Dec 05 23:32:39 crc kubenswrapper[4734]: I1205 23:32:39.023631 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cz6xv" Dec 05 23:32:39 crc kubenswrapper[4734]: I1205 23:32:39.138154 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9g9b\" (UniqueName: \"kubernetes.io/projected/40e54803-9b8a-48ff-a4be-9cd955d7031d-kube-api-access-z9g9b\") pod \"40e54803-9b8a-48ff-a4be-9cd955d7031d\" (UID: \"40e54803-9b8a-48ff-a4be-9cd955d7031d\") " Dec 05 23:32:39 crc kubenswrapper[4734]: I1205 23:32:39.138237 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40e54803-9b8a-48ff-a4be-9cd955d7031d-catalog-content\") pod \"40e54803-9b8a-48ff-a4be-9cd955d7031d\" (UID: \"40e54803-9b8a-48ff-a4be-9cd955d7031d\") " Dec 05 23:32:39 crc kubenswrapper[4734]: I1205 23:32:39.138428 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40e54803-9b8a-48ff-a4be-9cd955d7031d-utilities\") pod \"40e54803-9b8a-48ff-a4be-9cd955d7031d\" (UID: \"40e54803-9b8a-48ff-a4be-9cd955d7031d\") " Dec 05 23:32:39 crc kubenswrapper[4734]: I1205 23:32:39.139689 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40e54803-9b8a-48ff-a4be-9cd955d7031d-utilities" (OuterVolumeSpecName: "utilities") pod "40e54803-9b8a-48ff-a4be-9cd955d7031d" (UID: "40e54803-9b8a-48ff-a4be-9cd955d7031d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:32:39 crc kubenswrapper[4734]: I1205 23:32:39.149814 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40e54803-9b8a-48ff-a4be-9cd955d7031d-kube-api-access-z9g9b" (OuterVolumeSpecName: "kube-api-access-z9g9b") pod "40e54803-9b8a-48ff-a4be-9cd955d7031d" (UID: "40e54803-9b8a-48ff-a4be-9cd955d7031d"). InnerVolumeSpecName "kube-api-access-z9g9b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:32:39 crc kubenswrapper[4734]: I1205 23:32:39.240511 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9g9b\" (UniqueName: \"kubernetes.io/projected/40e54803-9b8a-48ff-a4be-9cd955d7031d-kube-api-access-z9g9b\") on node \"crc\" DevicePath \"\"" Dec 05 23:32:39 crc kubenswrapper[4734]: I1205 23:32:39.240569 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40e54803-9b8a-48ff-a4be-9cd955d7031d-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:32:39 crc kubenswrapper[4734]: I1205 23:32:39.256010 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40e54803-9b8a-48ff-a4be-9cd955d7031d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40e54803-9b8a-48ff-a4be-9cd955d7031d" (UID: "40e54803-9b8a-48ff-a4be-9cd955d7031d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:32:39 crc kubenswrapper[4734]: I1205 23:32:39.341999 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40e54803-9b8a-48ff-a4be-9cd955d7031d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:32:39 crc kubenswrapper[4734]: I1205 23:32:39.378994 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cz6xv" event={"ID":"40e54803-9b8a-48ff-a4be-9cd955d7031d","Type":"ContainerDied","Data":"262b2532b7bd99b1831bcd71ed2480db0ba303a9155cfb30b8f3a5315818e6ec"} Dec 05 23:32:39 crc kubenswrapper[4734]: I1205 23:32:39.379411 4734 scope.go:117] "RemoveContainer" containerID="c3cf9713e62176229e38bcf0196f7deacc663626f817534b6a4db864f0114d93" Dec 05 23:32:39 crc kubenswrapper[4734]: I1205 23:32:39.379129 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cz6xv" Dec 05 23:32:39 crc kubenswrapper[4734]: I1205 23:32:39.380817 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5b69785d4f-gksx6" event={"ID":"df912953-69c4-4841-abb5-afa544bd8df7","Type":"ContainerStarted","Data":"8b178fa81bb661c8275db0eb1eb46b10a2362d2550bec351dae0c5ad1b2f5c05"} Dec 05 23:32:39 crc kubenswrapper[4734]: I1205 23:32:39.381026 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5b69785d4f-gksx6" Dec 05 23:32:39 crc kubenswrapper[4734]: I1205 23:32:39.382731 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-8688474b6d-2dhr7" event={"ID":"97e5de92-85a3-4262-a82f-5b7195d72a9c","Type":"ContainerStarted","Data":"1a4d84f9da7b5027308128043d22050b13c4979541db661a67191387ccee3d55"} Dec 05 23:32:39 crc kubenswrapper[4734]: I1205 23:32:39.382871 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-8688474b6d-2dhr7" Dec 05 23:32:39 crc kubenswrapper[4734]: I1205 23:32:39.398928 4734 scope.go:117] "RemoveContainer" containerID="32a732f6e6dffd260ae95ad8156db2498a5f94d6a26d4f5c5f4bd88246aad058" Dec 05 23:32:39 crc kubenswrapper[4734]: I1205 23:32:39.412018 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-8688474b6d-2dhr7" podStartSLOduration=2.104013859 podStartE2EDuration="10.411991246s" podCreationTimestamp="2025-12-05 23:32:29 +0000 UTC" firstStartedPulling="2025-12-05 23:32:30.420699855 +0000 UTC m=+771.104104131" lastFinishedPulling="2025-12-05 23:32:38.728677242 +0000 UTC m=+779.412081518" observedRunningTime="2025-12-05 23:32:39.40633185 +0000 UTC m=+780.089736126" watchObservedRunningTime="2025-12-05 23:32:39.411991246 +0000 UTC m=+780.095395522" Dec 
05 23:32:39 crc kubenswrapper[4734]: I1205 23:32:39.427028 4734 scope.go:117] "RemoveContainer" containerID="7761a635c251df514e1069efa816474532cc038fa776b36a774f2e1db0a96c0d" Dec 05 23:32:39 crc kubenswrapper[4734]: I1205 23:32:39.438102 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5b69785d4f-gksx6" podStartSLOduration=1.978407518 podStartE2EDuration="10.438067556s" podCreationTimestamp="2025-12-05 23:32:29 +0000 UTC" firstStartedPulling="2025-12-05 23:32:30.248507488 +0000 UTC m=+770.931911754" lastFinishedPulling="2025-12-05 23:32:38.708167516 +0000 UTC m=+779.391571792" observedRunningTime="2025-12-05 23:32:39.43198531 +0000 UTC m=+780.115389586" watchObservedRunningTime="2025-12-05 23:32:39.438067556 +0000 UTC m=+780.121471832" Dec 05 23:32:39 crc kubenswrapper[4734]: I1205 23:32:39.454986 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cz6xv"] Dec 05 23:32:39 crc kubenswrapper[4734]: I1205 23:32:39.459279 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cz6xv"] Dec 05 23:32:39 crc kubenswrapper[4734]: I1205 23:32:39.625097 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40e54803-9b8a-48ff-a4be-9cd955d7031d" path="/var/lib/kubelet/pods/40e54803-9b8a-48ff-a4be-9cd955d7031d/volumes" Dec 05 23:32:50 crc kubenswrapper[4734]: I1205 23:32:50.444624 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:32:50 crc kubenswrapper[4734]: I1205 23:32:50.445612 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" 
podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:32:50 crc kubenswrapper[4734]: I1205 23:32:50.445684 4734 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" Dec 05 23:32:50 crc kubenswrapper[4734]: I1205 23:32:50.446503 4734 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8e69331b125e1151d942b08cb111e9d9d1598a8f70aacd7d59fba49b1cd48af6"} pod="openshift-machine-config-operator/machine-config-daemon-vn94d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 23:32:50 crc kubenswrapper[4734]: I1205 23:32:50.446578 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" containerID="cri-o://8e69331b125e1151d942b08cb111e9d9d1598a8f70aacd7d59fba49b1cd48af6" gracePeriod=600 Dec 05 23:32:50 crc kubenswrapper[4734]: I1205 23:32:50.671943 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-8688474b6d-2dhr7" Dec 05 23:32:51 crc kubenswrapper[4734]: I1205 23:32:51.469994 4734 generic.go:334] "Generic (PLEG): container finished" podID="65758270-a7a7-46b5-af95-0588daf9fa86" containerID="8e69331b125e1151d942b08cb111e9d9d1598a8f70aacd7d59fba49b1cd48af6" exitCode=0 Dec 05 23:32:51 crc kubenswrapper[4734]: I1205 23:32:51.470065 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" 
event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerDied","Data":"8e69331b125e1151d942b08cb111e9d9d1598a8f70aacd7d59fba49b1cd48af6"} Dec 05 23:32:51 crc kubenswrapper[4734]: I1205 23:32:51.470488 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerStarted","Data":"ce94e8a7ce0afa1b302b0a1993b5d90206c505bc6302ab5507859a6eab1dd7e0"} Dec 05 23:32:51 crc kubenswrapper[4734]: I1205 23:32:51.470550 4734 scope.go:117] "RemoveContainer" containerID="22bacdafd40b9938599de212c005778a6f3d95d2f7f54005c1b60a6e84bd1a7b" Dec 05 23:33:09 crc kubenswrapper[4734]: I1205 23:33:09.819243 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5b69785d4f-gksx6" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.662589 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-g87xk"] Dec 05 23:33:10 crc kubenswrapper[4734]: E1205 23:33:10.662963 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40e54803-9b8a-48ff-a4be-9cd955d7031d" containerName="registry-server" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.662981 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="40e54803-9b8a-48ff-a4be-9cd955d7031d" containerName="registry-server" Dec 05 23:33:10 crc kubenswrapper[4734]: E1205 23:33:10.663004 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40e54803-9b8a-48ff-a4be-9cd955d7031d" containerName="extract-utilities" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.663013 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="40e54803-9b8a-48ff-a4be-9cd955d7031d" containerName="extract-utilities" Dec 05 23:33:10 crc kubenswrapper[4734]: E1205 23:33:10.663025 4734 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="40e54803-9b8a-48ff-a4be-9cd955d7031d" containerName="extract-content" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.663034 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="40e54803-9b8a-48ff-a4be-9cd955d7031d" containerName="extract-content" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.663173 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="40e54803-9b8a-48ff-a4be-9cd955d7031d" containerName="registry-server" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.663774 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-g87xk" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.666961 4734 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-bl27v" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.667038 4734 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.674800 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-klfrp"] Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.677727 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-klfrp" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.679476 4734 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.680656 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.684305 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-g87xk"] Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.748437 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tblp\" (UniqueName: \"kubernetes.io/projected/b6fc283a-61b9-4920-90d7-2636375a958b-kube-api-access-9tblp\") pod \"frr-k8s-klfrp\" (UID: \"b6fc283a-61b9-4920-90d7-2636375a958b\") " pod="metallb-system/frr-k8s-klfrp" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.748553 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b6fc283a-61b9-4920-90d7-2636375a958b-frr-startup\") pod \"frr-k8s-klfrp\" (UID: \"b6fc283a-61b9-4920-90d7-2636375a958b\") " pod="metallb-system/frr-k8s-klfrp" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.748608 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh94z\" (UniqueName: \"kubernetes.io/projected/80152489-1b48-4b06-8684-983081b45f88-kube-api-access-dh94z\") pod \"frr-k8s-webhook-server-7fcb986d4-g87xk\" (UID: \"80152489-1b48-4b06-8684-983081b45f88\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-g87xk" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.748646 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/b6fc283a-61b9-4920-90d7-2636375a958b-metrics-certs\") pod \"frr-k8s-klfrp\" (UID: \"b6fc283a-61b9-4920-90d7-2636375a958b\") " pod="metallb-system/frr-k8s-klfrp" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.748768 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b6fc283a-61b9-4920-90d7-2636375a958b-reloader\") pod \"frr-k8s-klfrp\" (UID: \"b6fc283a-61b9-4920-90d7-2636375a958b\") " pod="metallb-system/frr-k8s-klfrp" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.748830 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b6fc283a-61b9-4920-90d7-2636375a958b-frr-sockets\") pod \"frr-k8s-klfrp\" (UID: \"b6fc283a-61b9-4920-90d7-2636375a958b\") " pod="metallb-system/frr-k8s-klfrp" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.748892 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b6fc283a-61b9-4920-90d7-2636375a958b-metrics\") pod \"frr-k8s-klfrp\" (UID: \"b6fc283a-61b9-4920-90d7-2636375a958b\") " pod="metallb-system/frr-k8s-klfrp" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.748980 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b6fc283a-61b9-4920-90d7-2636375a958b-frr-conf\") pod \"frr-k8s-klfrp\" (UID: \"b6fc283a-61b9-4920-90d7-2636375a958b\") " pod="metallb-system/frr-k8s-klfrp" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.749001 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80152489-1b48-4b06-8684-983081b45f88-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-g87xk\" (UID: 
\"80152489-1b48-4b06-8684-983081b45f88\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-g87xk" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.811768 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-csdv2"] Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.813356 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-csdv2" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.818764 4734 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-fsclg" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.819021 4734 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.819279 4734 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.820179 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.830335 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-cpg9k"] Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.831644 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-cpg9k" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.837879 4734 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.839718 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-cpg9k"] Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.852634 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b6fc283a-61b9-4920-90d7-2636375a958b-reloader\") pod \"frr-k8s-klfrp\" (UID: \"b6fc283a-61b9-4920-90d7-2636375a958b\") " pod="metallb-system/frr-k8s-klfrp" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.852710 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b6fc283a-61b9-4920-90d7-2636375a958b-frr-sockets\") pod \"frr-k8s-klfrp\" (UID: \"b6fc283a-61b9-4920-90d7-2636375a958b\") " pod="metallb-system/frr-k8s-klfrp" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.852765 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b07126f-ef86-48d5-b597-56782b518f5e-metrics-certs\") pod \"speaker-csdv2\" (UID: \"0b07126f-ef86-48d5-b597-56782b518f5e\") " pod="metallb-system/speaker-csdv2" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.852830 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b6fc283a-61b9-4920-90d7-2636375a958b-metrics\") pod \"frr-k8s-klfrp\" (UID: \"b6fc283a-61b9-4920-90d7-2636375a958b\") " pod="metallb-system/frr-k8s-klfrp" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.852921 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b6fc283a-61b9-4920-90d7-2636375a958b-frr-conf\") pod \"frr-k8s-klfrp\" (UID: \"b6fc283a-61b9-4920-90d7-2636375a958b\") " pod="metallb-system/frr-k8s-klfrp" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.852947 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80152489-1b48-4b06-8684-983081b45f88-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-g87xk\" (UID: \"80152489-1b48-4b06-8684-983081b45f88\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-g87xk" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.853002 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0b07126f-ef86-48d5-b597-56782b518f5e-memberlist\") pod \"speaker-csdv2\" (UID: \"0b07126f-ef86-48d5-b597-56782b518f5e\") " pod="metallb-system/speaker-csdv2" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.853033 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tblp\" (UniqueName: \"kubernetes.io/projected/b6fc283a-61b9-4920-90d7-2636375a958b-kube-api-access-9tblp\") pod \"frr-k8s-klfrp\" (UID: \"b6fc283a-61b9-4920-90d7-2636375a958b\") " pod="metallb-system/frr-k8s-klfrp" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.853072 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b6fc283a-61b9-4920-90d7-2636375a958b-frr-startup\") pod \"frr-k8s-klfrp\" (UID: \"b6fc283a-61b9-4920-90d7-2636375a958b\") " pod="metallb-system/frr-k8s-klfrp" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.853106 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh94z\" (UniqueName: \"kubernetes.io/projected/80152489-1b48-4b06-8684-983081b45f88-kube-api-access-dh94z\") pod 
\"frr-k8s-webhook-server-7fcb986d4-g87xk\" (UID: \"80152489-1b48-4b06-8684-983081b45f88\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-g87xk" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.853158 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6fc283a-61b9-4920-90d7-2636375a958b-metrics-certs\") pod \"frr-k8s-klfrp\" (UID: \"b6fc283a-61b9-4920-90d7-2636375a958b\") " pod="metallb-system/frr-k8s-klfrp" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.853240 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zddzr\" (UniqueName: \"kubernetes.io/projected/0b07126f-ef86-48d5-b597-56782b518f5e-kube-api-access-zddzr\") pod \"speaker-csdv2\" (UID: \"0b07126f-ef86-48d5-b597-56782b518f5e\") " pod="metallb-system/speaker-csdv2" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.853263 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0b07126f-ef86-48d5-b597-56782b518f5e-metallb-excludel2\") pod \"speaker-csdv2\" (UID: \"0b07126f-ef86-48d5-b597-56782b518f5e\") " pod="metallb-system/speaker-csdv2" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.853417 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b6fc283a-61b9-4920-90d7-2636375a958b-reloader\") pod \"frr-k8s-klfrp\" (UID: \"b6fc283a-61b9-4920-90d7-2636375a958b\") " pod="metallb-system/frr-k8s-klfrp" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.853472 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b6fc283a-61b9-4920-90d7-2636375a958b-frr-sockets\") pod \"frr-k8s-klfrp\" (UID: \"b6fc283a-61b9-4920-90d7-2636375a958b\") " pod="metallb-system/frr-k8s-klfrp" Dec 05 
23:33:10 crc kubenswrapper[4734]: E1205 23:33:10.853902 4734 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.853986 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b6fc283a-61b9-4920-90d7-2636375a958b-metrics\") pod \"frr-k8s-klfrp\" (UID: \"b6fc283a-61b9-4920-90d7-2636375a958b\") " pod="metallb-system/frr-k8s-klfrp" Dec 05 23:33:10 crc kubenswrapper[4734]: E1205 23:33:10.853993 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6fc283a-61b9-4920-90d7-2636375a958b-metrics-certs podName:b6fc283a-61b9-4920-90d7-2636375a958b nodeName:}" failed. No retries permitted until 2025-12-05 23:33:11.353967463 +0000 UTC m=+812.037371739 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b6fc283a-61b9-4920-90d7-2636375a958b-metrics-certs") pod "frr-k8s-klfrp" (UID: "b6fc283a-61b9-4920-90d7-2636375a958b") : secret "frr-k8s-certs-secret" not found Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.854123 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b6fc283a-61b9-4920-90d7-2636375a958b-frr-conf\") pod \"frr-k8s-klfrp\" (UID: \"b6fc283a-61b9-4920-90d7-2636375a958b\") " pod="metallb-system/frr-k8s-klfrp" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.854521 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b6fc283a-61b9-4920-90d7-2636375a958b-frr-startup\") pod \"frr-k8s-klfrp\" (UID: \"b6fc283a-61b9-4920-90d7-2636375a958b\") " pod="metallb-system/frr-k8s-klfrp" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.873567 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/80152489-1b48-4b06-8684-983081b45f88-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-g87xk\" (UID: \"80152489-1b48-4b06-8684-983081b45f88\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-g87xk" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.914579 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh94z\" (UniqueName: \"kubernetes.io/projected/80152489-1b48-4b06-8684-983081b45f88-kube-api-access-dh94z\") pod \"frr-k8s-webhook-server-7fcb986d4-g87xk\" (UID: \"80152489-1b48-4b06-8684-983081b45f88\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-g87xk" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.916383 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tblp\" (UniqueName: \"kubernetes.io/projected/b6fc283a-61b9-4920-90d7-2636375a958b-kube-api-access-9tblp\") pod \"frr-k8s-klfrp\" (UID: \"b6fc283a-61b9-4920-90d7-2636375a958b\") " pod="metallb-system/frr-k8s-klfrp" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.954284 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zddzr\" (UniqueName: \"kubernetes.io/projected/0b07126f-ef86-48d5-b597-56782b518f5e-kube-api-access-zddzr\") pod \"speaker-csdv2\" (UID: \"0b07126f-ef86-48d5-b597-56782b518f5e\") " pod="metallb-system/speaker-csdv2" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.954326 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0b07126f-ef86-48d5-b597-56782b518f5e-metallb-excludel2\") pod \"speaker-csdv2\" (UID: \"0b07126f-ef86-48d5-b597-56782b518f5e\") " pod="metallb-system/speaker-csdv2" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.954349 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/ac2c5d10-25e3-4d0e-9632-ee5701c15e7e-metrics-certs\") pod \"controller-f8648f98b-cpg9k\" (UID: \"ac2c5d10-25e3-4d0e-9632-ee5701c15e7e\") " pod="metallb-system/controller-f8648f98b-cpg9k" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.954371 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac2c5d10-25e3-4d0e-9632-ee5701c15e7e-cert\") pod \"controller-f8648f98b-cpg9k\" (UID: \"ac2c5d10-25e3-4d0e-9632-ee5701c15e7e\") " pod="metallb-system/controller-f8648f98b-cpg9k" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.954397 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b07126f-ef86-48d5-b597-56782b518f5e-metrics-certs\") pod \"speaker-csdv2\" (UID: \"0b07126f-ef86-48d5-b597-56782b518f5e\") " pod="metallb-system/speaker-csdv2" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.954436 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0b07126f-ef86-48d5-b597-56782b518f5e-memberlist\") pod \"speaker-csdv2\" (UID: \"0b07126f-ef86-48d5-b597-56782b518f5e\") " pod="metallb-system/speaker-csdv2" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.954455 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2b9l\" (UniqueName: \"kubernetes.io/projected/ac2c5d10-25e3-4d0e-9632-ee5701c15e7e-kube-api-access-l2b9l\") pod \"controller-f8648f98b-cpg9k\" (UID: \"ac2c5d10-25e3-4d0e-9632-ee5701c15e7e\") " pod="metallb-system/controller-f8648f98b-cpg9k" Dec 05 23:33:10 crc kubenswrapper[4734]: E1205 23:33:10.954820 4734 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 05 23:33:10 crc kubenswrapper[4734]: E1205 23:33:10.954917 4734 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b07126f-ef86-48d5-b597-56782b518f5e-memberlist podName:0b07126f-ef86-48d5-b597-56782b518f5e nodeName:}" failed. No retries permitted until 2025-12-05 23:33:11.454888713 +0000 UTC m=+812.138292989 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/0b07126f-ef86-48d5-b597-56782b518f5e-memberlist") pod "speaker-csdv2" (UID: "0b07126f-ef86-48d5-b597-56782b518f5e") : secret "metallb-memberlist" not found Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.956862 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0b07126f-ef86-48d5-b597-56782b518f5e-metallb-excludel2\") pod \"speaker-csdv2\" (UID: \"0b07126f-ef86-48d5-b597-56782b518f5e\") " pod="metallb-system/speaker-csdv2" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.958166 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b07126f-ef86-48d5-b597-56782b518f5e-metrics-certs\") pod \"speaker-csdv2\" (UID: \"0b07126f-ef86-48d5-b597-56782b518f5e\") " pod="metallb-system/speaker-csdv2" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.980217 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zddzr\" (UniqueName: \"kubernetes.io/projected/0b07126f-ef86-48d5-b597-56782b518f5e-kube-api-access-zddzr\") pod \"speaker-csdv2\" (UID: \"0b07126f-ef86-48d5-b597-56782b518f5e\") " pod="metallb-system/speaker-csdv2" Dec 05 23:33:10 crc kubenswrapper[4734]: I1205 23:33:10.988283 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-g87xk" Dec 05 23:33:11 crc kubenswrapper[4734]: I1205 23:33:11.058388 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac2c5d10-25e3-4d0e-9632-ee5701c15e7e-metrics-certs\") pod \"controller-f8648f98b-cpg9k\" (UID: \"ac2c5d10-25e3-4d0e-9632-ee5701c15e7e\") " pod="metallb-system/controller-f8648f98b-cpg9k" Dec 05 23:33:11 crc kubenswrapper[4734]: I1205 23:33:11.058447 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac2c5d10-25e3-4d0e-9632-ee5701c15e7e-cert\") pod \"controller-f8648f98b-cpg9k\" (UID: \"ac2c5d10-25e3-4d0e-9632-ee5701c15e7e\") " pod="metallb-system/controller-f8648f98b-cpg9k" Dec 05 23:33:11 crc kubenswrapper[4734]: I1205 23:33:11.058551 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2b9l\" (UniqueName: \"kubernetes.io/projected/ac2c5d10-25e3-4d0e-9632-ee5701c15e7e-kube-api-access-l2b9l\") pod \"controller-f8648f98b-cpg9k\" (UID: \"ac2c5d10-25e3-4d0e-9632-ee5701c15e7e\") " pod="metallb-system/controller-f8648f98b-cpg9k" Dec 05 23:33:11 crc kubenswrapper[4734]: I1205 23:33:11.062584 4734 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 05 23:33:11 crc kubenswrapper[4734]: I1205 23:33:11.064123 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac2c5d10-25e3-4d0e-9632-ee5701c15e7e-metrics-certs\") pod \"controller-f8648f98b-cpg9k\" (UID: \"ac2c5d10-25e3-4d0e-9632-ee5701c15e7e\") " pod="metallb-system/controller-f8648f98b-cpg9k" Dec 05 23:33:11 crc kubenswrapper[4734]: I1205 23:33:11.073509 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/ac2c5d10-25e3-4d0e-9632-ee5701c15e7e-cert\") pod \"controller-f8648f98b-cpg9k\" (UID: \"ac2c5d10-25e3-4d0e-9632-ee5701c15e7e\") " pod="metallb-system/controller-f8648f98b-cpg9k" Dec 05 23:33:11 crc kubenswrapper[4734]: I1205 23:33:11.075841 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2b9l\" (UniqueName: \"kubernetes.io/projected/ac2c5d10-25e3-4d0e-9632-ee5701c15e7e-kube-api-access-l2b9l\") pod \"controller-f8648f98b-cpg9k\" (UID: \"ac2c5d10-25e3-4d0e-9632-ee5701c15e7e\") " pod="metallb-system/controller-f8648f98b-cpg9k" Dec 05 23:33:11 crc kubenswrapper[4734]: I1205 23:33:11.146247 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-cpg9k" Dec 05 23:33:11 crc kubenswrapper[4734]: I1205 23:33:11.183458 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-g87xk"] Dec 05 23:33:11 crc kubenswrapper[4734]: W1205 23:33:11.188594 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80152489_1b48_4b06_8684_983081b45f88.slice/crio-c72b88ab907321adac5ab7c0af75f740d7850a50e59c5928e0dc392bae7eeba5 WatchSource:0}: Error finding container c72b88ab907321adac5ab7c0af75f740d7850a50e59c5928e0dc392bae7eeba5: Status 404 returned error can't find the container with id c72b88ab907321adac5ab7c0af75f740d7850a50e59c5928e0dc392bae7eeba5 Dec 05 23:33:11 crc kubenswrapper[4734]: I1205 23:33:11.364205 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6fc283a-61b9-4920-90d7-2636375a958b-metrics-certs\") pod \"frr-k8s-klfrp\" (UID: \"b6fc283a-61b9-4920-90d7-2636375a958b\") " pod="metallb-system/frr-k8s-klfrp" Dec 05 23:33:11 crc kubenswrapper[4734]: I1205 23:33:11.368610 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6fc283a-61b9-4920-90d7-2636375a958b-metrics-certs\") pod \"frr-k8s-klfrp\" (UID: \"b6fc283a-61b9-4920-90d7-2636375a958b\") " pod="metallb-system/frr-k8s-klfrp" Dec 05 23:33:11 crc kubenswrapper[4734]: I1205 23:33:11.465982 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0b07126f-ef86-48d5-b597-56782b518f5e-memberlist\") pod \"speaker-csdv2\" (UID: \"0b07126f-ef86-48d5-b597-56782b518f5e\") " pod="metallb-system/speaker-csdv2" Dec 05 23:33:11 crc kubenswrapper[4734]: E1205 23:33:11.467754 4734 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 05 23:33:11 crc kubenswrapper[4734]: E1205 23:33:11.467914 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b07126f-ef86-48d5-b597-56782b518f5e-memberlist podName:0b07126f-ef86-48d5-b597-56782b518f5e nodeName:}" failed. No retries permitted until 2025-12-05 23:33:12.467881809 +0000 UTC m=+813.151286085 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/0b07126f-ef86-48d5-b597-56782b518f5e-memberlist") pod "speaker-csdv2" (UID: "0b07126f-ef86-48d5-b597-56782b518f5e") : secret "metallb-memberlist" not found Dec 05 23:33:11 crc kubenswrapper[4734]: I1205 23:33:11.573182 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-cpg9k"] Dec 05 23:33:11 crc kubenswrapper[4734]: W1205 23:33:11.578756 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac2c5d10_25e3_4d0e_9632_ee5701c15e7e.slice/crio-153c875b9ef901c905d352056c165e7901950e609ae582ad723daacbe62680f6 WatchSource:0}: Error finding container 153c875b9ef901c905d352056c165e7901950e609ae582ad723daacbe62680f6: Status 404 returned error can't find the container with id 153c875b9ef901c905d352056c165e7901950e609ae582ad723daacbe62680f6 Dec 05 23:33:11 crc kubenswrapper[4734]: I1205 23:33:11.604094 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-klfrp" Dec 05 23:33:11 crc kubenswrapper[4734]: I1205 23:33:11.634086 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-cpg9k" event={"ID":"ac2c5d10-25e3-4d0e-9632-ee5701c15e7e","Type":"ContainerStarted","Data":"153c875b9ef901c905d352056c165e7901950e609ae582ad723daacbe62680f6"} Dec 05 23:33:11 crc kubenswrapper[4734]: I1205 23:33:11.635621 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-g87xk" event={"ID":"80152489-1b48-4b06-8684-983081b45f88","Type":"ContainerStarted","Data":"c72b88ab907321adac5ab7c0af75f740d7850a50e59c5928e0dc392bae7eeba5"} Dec 05 23:33:12 crc kubenswrapper[4734]: I1205 23:33:12.481435 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0b07126f-ef86-48d5-b597-56782b518f5e-memberlist\") pod \"speaker-csdv2\" (UID: \"0b07126f-ef86-48d5-b597-56782b518f5e\") " pod="metallb-system/speaker-csdv2" Dec 05 23:33:12 crc kubenswrapper[4734]: I1205 23:33:12.487547 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0b07126f-ef86-48d5-b597-56782b518f5e-memberlist\") pod \"speaker-csdv2\" (UID: \"0b07126f-ef86-48d5-b597-56782b518f5e\") " pod="metallb-system/speaker-csdv2" Dec 05 23:33:12 crc kubenswrapper[4734]: I1205 23:33:12.631704 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-csdv2" Dec 05 23:33:12 crc kubenswrapper[4734]: I1205 23:33:12.647244 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-klfrp" event={"ID":"b6fc283a-61b9-4920-90d7-2636375a958b","Type":"ContainerStarted","Data":"5bd15d816b0ef0f9d50945214b24d599cc9295739b3ee608f8ac858511fee143"} Dec 05 23:33:12 crc kubenswrapper[4734]: I1205 23:33:12.658858 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-cpg9k" event={"ID":"ac2c5d10-25e3-4d0e-9632-ee5701c15e7e","Type":"ContainerStarted","Data":"1f7745fd8bae3bca30196b6d0204705d0285c72d4507700bd52d051c1b7b0e16"} Dec 05 23:33:12 crc kubenswrapper[4734]: I1205 23:33:12.658902 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-cpg9k" event={"ID":"ac2c5d10-25e3-4d0e-9632-ee5701c15e7e","Type":"ContainerStarted","Data":"9ee641233e27727257a6cacfe196afc62e720af6f81ff5b36849d19a76f3d81c"} Dec 05 23:33:12 crc kubenswrapper[4734]: I1205 23:33:12.659184 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-cpg9k" Dec 05 23:33:12 crc kubenswrapper[4734]: W1205 23:33:12.662290 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b07126f_ef86_48d5_b597_56782b518f5e.slice/crio-6dc972883c93093618366fb0a783693d7e505f71401f625e944fb47b5b2edba5 WatchSource:0}: Error finding container 6dc972883c93093618366fb0a783693d7e505f71401f625e944fb47b5b2edba5: Status 404 returned error can't find the container with id 6dc972883c93093618366fb0a783693d7e505f71401f625e944fb47b5b2edba5 Dec 05 23:33:12 crc kubenswrapper[4734]: I1205 23:33:12.683746 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-cpg9k" podStartSLOduration=2.683716578 podStartE2EDuration="2.683716578s" 
podCreationTimestamp="2025-12-05 23:33:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:33:12.679071546 +0000 UTC m=+813.362475832" watchObservedRunningTime="2025-12-05 23:33:12.683716578 +0000 UTC m=+813.367120864" Dec 05 23:33:13 crc kubenswrapper[4734]: I1205 23:33:13.691339 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-csdv2" event={"ID":"0b07126f-ef86-48d5-b597-56782b518f5e","Type":"ContainerStarted","Data":"04f6855faf7acbc63b058b2d58582793452361e8da8d7275749c40a84930e9ba"} Dec 05 23:33:13 crc kubenswrapper[4734]: I1205 23:33:13.691387 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-csdv2" event={"ID":"0b07126f-ef86-48d5-b597-56782b518f5e","Type":"ContainerStarted","Data":"c7e08176bef13142296d9f6c8a05a033d19963eb4b0f842f91651fed940dcd08"} Dec 05 23:33:13 crc kubenswrapper[4734]: I1205 23:33:13.691400 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-csdv2" event={"ID":"0b07126f-ef86-48d5-b597-56782b518f5e","Type":"ContainerStarted","Data":"6dc972883c93093618366fb0a783693d7e505f71401f625e944fb47b5b2edba5"} Dec 05 23:33:13 crc kubenswrapper[4734]: I1205 23:33:13.691708 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-csdv2" Dec 05 23:33:13 crc kubenswrapper[4734]: I1205 23:33:13.716948 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-csdv2" podStartSLOduration=3.716926442 podStartE2EDuration="3.716926442s" podCreationTimestamp="2025-12-05 23:33:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:33:13.710472115 +0000 UTC m=+814.393876391" watchObservedRunningTime="2025-12-05 23:33:13.716926442 +0000 UTC m=+814.400330718" Dec 05 23:33:20 crc 
kubenswrapper[4734]: I1205 23:33:20.756794 4734 generic.go:334] "Generic (PLEG): container finished" podID="b6fc283a-61b9-4920-90d7-2636375a958b" containerID="140e62e601a303fe79db7f1ec90be797bb5a663fcd010d2d638c1b9d5766459a" exitCode=0 Dec 05 23:33:20 crc kubenswrapper[4734]: I1205 23:33:20.756883 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-klfrp" event={"ID":"b6fc283a-61b9-4920-90d7-2636375a958b","Type":"ContainerDied","Data":"140e62e601a303fe79db7f1ec90be797bb5a663fcd010d2d638c1b9d5766459a"} Dec 05 23:33:20 crc kubenswrapper[4734]: I1205 23:33:20.762013 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-g87xk" event={"ID":"80152489-1b48-4b06-8684-983081b45f88","Type":"ContainerStarted","Data":"f09729680ca7cff7a01f3845c447dcbbf16498497cd87337a980f822e4db9c1d"} Dec 05 23:33:20 crc kubenswrapper[4734]: I1205 23:33:20.762362 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-g87xk" Dec 05 23:33:21 crc kubenswrapper[4734]: I1205 23:33:21.151660 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-cpg9k" Dec 05 23:33:21 crc kubenswrapper[4734]: I1205 23:33:21.183486 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-g87xk" podStartSLOduration=2.691699251 podStartE2EDuration="11.183464568s" podCreationTimestamp="2025-12-05 23:33:10 +0000 UTC" firstStartedPulling="2025-12-05 23:33:11.191600947 +0000 UTC m=+811.875005223" lastFinishedPulling="2025-12-05 23:33:19.683366264 +0000 UTC m=+820.366770540" observedRunningTime="2025-12-05 23:33:20.812092747 +0000 UTC m=+821.495497053" watchObservedRunningTime="2025-12-05 23:33:21.183464568 +0000 UTC m=+821.866868844" Dec 05 23:33:21 crc kubenswrapper[4734]: I1205 23:33:21.772233 4734 generic.go:334] "Generic (PLEG): container 
finished" podID="b6fc283a-61b9-4920-90d7-2636375a958b" containerID="9fd17c83d3c2c8578dafcf0e1b56cf086a61f18ef256de837050837a3ad78a39" exitCode=0 Dec 05 23:33:21 crc kubenswrapper[4734]: I1205 23:33:21.772351 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-klfrp" event={"ID":"b6fc283a-61b9-4920-90d7-2636375a958b","Type":"ContainerDied","Data":"9fd17c83d3c2c8578dafcf0e1b56cf086a61f18ef256de837050837a3ad78a39"} Dec 05 23:33:22 crc kubenswrapper[4734]: I1205 23:33:22.639243 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-csdv2" Dec 05 23:33:22 crc kubenswrapper[4734]: I1205 23:33:22.781001 4734 generic.go:334] "Generic (PLEG): container finished" podID="b6fc283a-61b9-4920-90d7-2636375a958b" containerID="1e9b85bde1cd600ee1447e98e8d89df9b2066bd0fd01edfda235e944c0895c00" exitCode=0 Dec 05 23:33:22 crc kubenswrapper[4734]: I1205 23:33:22.781064 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-klfrp" event={"ID":"b6fc283a-61b9-4920-90d7-2636375a958b","Type":"ContainerDied","Data":"1e9b85bde1cd600ee1447e98e8d89df9b2066bd0fd01edfda235e944c0895c00"} Dec 05 23:33:23 crc kubenswrapper[4734]: I1205 23:33:23.797482 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-klfrp" event={"ID":"b6fc283a-61b9-4920-90d7-2636375a958b","Type":"ContainerStarted","Data":"f5a516bd1f2b2a147b464ada9a8e855101afb8193cae47bbcc77f8506641fcad"} Dec 05 23:33:23 crc kubenswrapper[4734]: I1205 23:33:23.797968 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-klfrp" event={"ID":"b6fc283a-61b9-4920-90d7-2636375a958b","Type":"ContainerStarted","Data":"558de5ff991c93becf985127628b01a9f914483304e638366c601d11faeb930b"} Dec 05 23:33:23 crc kubenswrapper[4734]: I1205 23:33:23.797991 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-klfrp" 
event={"ID":"b6fc283a-61b9-4920-90d7-2636375a958b","Type":"ContainerStarted","Data":"3aa10a2e174b75f74ebd4f96a01536f649d6b6677873dcbb06d8cabee86ac8a5"} Dec 05 23:33:23 crc kubenswrapper[4734]: I1205 23:33:23.798011 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-klfrp" event={"ID":"b6fc283a-61b9-4920-90d7-2636375a958b","Type":"ContainerStarted","Data":"cb41133d63c7fbc6f92ef2f3f8f30a8a559a4a10e65ba7563988e9ae22a90e34"} Dec 05 23:33:24 crc kubenswrapper[4734]: I1205 23:33:24.809138 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-klfrp" event={"ID":"b6fc283a-61b9-4920-90d7-2636375a958b","Type":"ContainerStarted","Data":"11f6504b3603b009e979ae6eee54a5aa92805d5678acf93d1b679591414bc959"} Dec 05 23:33:24 crc kubenswrapper[4734]: I1205 23:33:24.809223 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-klfrp" event={"ID":"b6fc283a-61b9-4920-90d7-2636375a958b","Type":"ContainerStarted","Data":"6453eeae3080f4c2c59b6c76f932688abc585c8ae514734a7b48cc7981fd898c"} Dec 05 23:33:24 crc kubenswrapper[4734]: I1205 23:33:24.809970 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-klfrp" Dec 05 23:33:24 crc kubenswrapper[4734]: I1205 23:33:24.834046 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-klfrp" podStartSLOduration=6.842785955 podStartE2EDuration="14.834019368s" podCreationTimestamp="2025-12-05 23:33:10 +0000 UTC" firstStartedPulling="2025-12-05 23:33:11.718436887 +0000 UTC m=+812.401841163" lastFinishedPulling="2025-12-05 23:33:19.7096703 +0000 UTC m=+820.393074576" observedRunningTime="2025-12-05 23:33:24.831296303 +0000 UTC m=+825.514700659" watchObservedRunningTime="2025-12-05 23:33:24.834019368 +0000 UTC m=+825.517423664" Dec 05 23:33:25 crc kubenswrapper[4734]: I1205 23:33:25.608718 4734 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-operator-index-vfvnt"] Dec 05 23:33:25 crc kubenswrapper[4734]: I1205 23:33:25.609574 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-vfvnt" Dec 05 23:33:25 crc kubenswrapper[4734]: I1205 23:33:25.613500 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-xltth" Dec 05 23:33:25 crc kubenswrapper[4734]: I1205 23:33:25.613849 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 05 23:33:25 crc kubenswrapper[4734]: I1205 23:33:25.614314 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 05 23:33:25 crc kubenswrapper[4734]: I1205 23:33:25.638842 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-vfvnt"] Dec 05 23:33:25 crc kubenswrapper[4734]: I1205 23:33:25.707154 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpmkt\" (UniqueName: \"kubernetes.io/projected/bf2ea490-afd1-41e0-b8a7-c5960f6ea26c-kube-api-access-bpmkt\") pod \"openstack-operator-index-vfvnt\" (UID: \"bf2ea490-afd1-41e0-b8a7-c5960f6ea26c\") " pod="openstack-operators/openstack-operator-index-vfvnt" Dec 05 23:33:25 crc kubenswrapper[4734]: I1205 23:33:25.808347 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpmkt\" (UniqueName: \"kubernetes.io/projected/bf2ea490-afd1-41e0-b8a7-c5960f6ea26c-kube-api-access-bpmkt\") pod \"openstack-operator-index-vfvnt\" (UID: \"bf2ea490-afd1-41e0-b8a7-c5960f6ea26c\") " pod="openstack-operators/openstack-operator-index-vfvnt" Dec 05 23:33:25 crc kubenswrapper[4734]: I1205 23:33:25.832682 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpmkt\" 
(UniqueName: \"kubernetes.io/projected/bf2ea490-afd1-41e0-b8a7-c5960f6ea26c-kube-api-access-bpmkt\") pod \"openstack-operator-index-vfvnt\" (UID: \"bf2ea490-afd1-41e0-b8a7-c5960f6ea26c\") " pod="openstack-operators/openstack-operator-index-vfvnt" Dec 05 23:33:25 crc kubenswrapper[4734]: I1205 23:33:25.929433 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-vfvnt" Dec 05 23:33:26 crc kubenswrapper[4734]: I1205 23:33:26.156733 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-vfvnt"] Dec 05 23:33:26 crc kubenswrapper[4734]: W1205 23:33:26.161949 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf2ea490_afd1_41e0_b8a7_c5960f6ea26c.slice/crio-c9aaf40b4145450d76fb22299d27ce8efdb39dcc5cdc95628f1343f742b3b228 WatchSource:0}: Error finding container c9aaf40b4145450d76fb22299d27ce8efdb39dcc5cdc95628f1343f742b3b228: Status 404 returned error can't find the container with id c9aaf40b4145450d76fb22299d27ce8efdb39dcc5cdc95628f1343f742b3b228 Dec 05 23:33:26 crc kubenswrapper[4734]: I1205 23:33:26.604741 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-klfrp" Dec 05 23:33:26 crc kubenswrapper[4734]: I1205 23:33:26.653245 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-klfrp" Dec 05 23:33:26 crc kubenswrapper[4734]: I1205 23:33:26.823161 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vfvnt" event={"ID":"bf2ea490-afd1-41e0-b8a7-c5960f6ea26c","Type":"ContainerStarted","Data":"c9aaf40b4145450d76fb22299d27ce8efdb39dcc5cdc95628f1343f742b3b228"} Dec 05 23:33:28 crc kubenswrapper[4734]: I1205 23:33:28.978186 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-operators/openstack-operator-index-vfvnt"] Dec 05 23:33:29 crc kubenswrapper[4734]: I1205 23:33:29.583097 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-c7wgh"] Dec 05 23:33:29 crc kubenswrapper[4734]: I1205 23:33:29.584644 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-c7wgh" Dec 05 23:33:29 crc kubenswrapper[4734]: I1205 23:33:29.591653 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-c7wgh"] Dec 05 23:33:29 crc kubenswrapper[4734]: I1205 23:33:29.667579 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6rfl\" (UniqueName: \"kubernetes.io/projected/99bde61f-d552-4013-b4fc-eb55e428f53b-kube-api-access-q6rfl\") pod \"openstack-operator-index-c7wgh\" (UID: \"99bde61f-d552-4013-b4fc-eb55e428f53b\") " pod="openstack-operators/openstack-operator-index-c7wgh" Dec 05 23:33:29 crc kubenswrapper[4734]: I1205 23:33:29.771130 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6rfl\" (UniqueName: \"kubernetes.io/projected/99bde61f-d552-4013-b4fc-eb55e428f53b-kube-api-access-q6rfl\") pod \"openstack-operator-index-c7wgh\" (UID: \"99bde61f-d552-4013-b4fc-eb55e428f53b\") " pod="openstack-operators/openstack-operator-index-c7wgh" Dec 05 23:33:29 crc kubenswrapper[4734]: I1205 23:33:29.808899 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6rfl\" (UniqueName: \"kubernetes.io/projected/99bde61f-d552-4013-b4fc-eb55e428f53b-kube-api-access-q6rfl\") pod \"openstack-operator-index-c7wgh\" (UID: \"99bde61f-d552-4013-b4fc-eb55e428f53b\") " pod="openstack-operators/openstack-operator-index-c7wgh" Dec 05 23:33:29 crc kubenswrapper[4734]: I1205 23:33:29.852831 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-index-vfvnt" event={"ID":"bf2ea490-afd1-41e0-b8a7-c5960f6ea26c","Type":"ContainerStarted","Data":"3aea987dd6bf3ed211178f5d13255c0fd51f5a4545f3439117a9e59abd79cee3"} Dec 05 23:33:29 crc kubenswrapper[4734]: I1205 23:33:29.877450 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-vfvnt" podStartSLOduration=2.297478557 podStartE2EDuration="4.877420421s" podCreationTimestamp="2025-12-05 23:33:25 +0000 UTC" firstStartedPulling="2025-12-05 23:33:26.166013447 +0000 UTC m=+826.849417723" lastFinishedPulling="2025-12-05 23:33:28.745955281 +0000 UTC m=+829.429359587" observedRunningTime="2025-12-05 23:33:29.871059686 +0000 UTC m=+830.554463962" watchObservedRunningTime="2025-12-05 23:33:29.877420421 +0000 UTC m=+830.560824727" Dec 05 23:33:29 crc kubenswrapper[4734]: I1205 23:33:29.950754 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-c7wgh" Dec 05 23:33:30 crc kubenswrapper[4734]: I1205 23:33:30.400587 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-c7wgh"] Dec 05 23:33:30 crc kubenswrapper[4734]: W1205 23:33:30.416335 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99bde61f_d552_4013_b4fc_eb55e428f53b.slice/crio-ceaa713214bf684abb7c51c6cb91a759c29d47e95569bb984b8ea65a929ba50a WatchSource:0}: Error finding container ceaa713214bf684abb7c51c6cb91a759c29d47e95569bb984b8ea65a929ba50a: Status 404 returned error can't find the container with id ceaa713214bf684abb7c51c6cb91a759c29d47e95569bb984b8ea65a929ba50a Dec 05 23:33:30 crc kubenswrapper[4734]: I1205 23:33:30.864865 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-c7wgh" 
event={"ID":"99bde61f-d552-4013-b4fc-eb55e428f53b","Type":"ContainerStarted","Data":"b3e4ccf4c563f4c854f0c260b4c5f120f2fe37c1b7fae16086a3aec96217949d"} Dec 05 23:33:30 crc kubenswrapper[4734]: I1205 23:33:30.864922 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-c7wgh" event={"ID":"99bde61f-d552-4013-b4fc-eb55e428f53b","Type":"ContainerStarted","Data":"ceaa713214bf684abb7c51c6cb91a759c29d47e95569bb984b8ea65a929ba50a"} Dec 05 23:33:30 crc kubenswrapper[4734]: I1205 23:33:30.865054 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-vfvnt" podUID="bf2ea490-afd1-41e0-b8a7-c5960f6ea26c" containerName="registry-server" containerID="cri-o://3aea987dd6bf3ed211178f5d13255c0fd51f5a4545f3439117a9e59abd79cee3" gracePeriod=2 Dec 05 23:33:30 crc kubenswrapper[4734]: I1205 23:33:30.888306 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-c7wgh" podStartSLOduration=1.7966912590000002 podStartE2EDuration="1.888253963s" podCreationTimestamp="2025-12-05 23:33:29 +0000 UTC" firstStartedPulling="2025-12-05 23:33:30.421888616 +0000 UTC m=+831.105292902" lastFinishedPulling="2025-12-05 23:33:30.51345131 +0000 UTC m=+831.196855606" observedRunningTime="2025-12-05 23:33:30.887137006 +0000 UTC m=+831.570541322" watchObservedRunningTime="2025-12-05 23:33:30.888253963 +0000 UTC m=+831.571658249" Dec 05 23:33:30 crc kubenswrapper[4734]: I1205 23:33:30.994351 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-g87xk" Dec 05 23:33:31 crc kubenswrapper[4734]: I1205 23:33:31.308718 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-vfvnt" Dec 05 23:33:31 crc kubenswrapper[4734]: I1205 23:33:31.395579 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpmkt\" (UniqueName: \"kubernetes.io/projected/bf2ea490-afd1-41e0-b8a7-c5960f6ea26c-kube-api-access-bpmkt\") pod \"bf2ea490-afd1-41e0-b8a7-c5960f6ea26c\" (UID: \"bf2ea490-afd1-41e0-b8a7-c5960f6ea26c\") " Dec 05 23:33:31 crc kubenswrapper[4734]: I1205 23:33:31.402832 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf2ea490-afd1-41e0-b8a7-c5960f6ea26c-kube-api-access-bpmkt" (OuterVolumeSpecName: "kube-api-access-bpmkt") pod "bf2ea490-afd1-41e0-b8a7-c5960f6ea26c" (UID: "bf2ea490-afd1-41e0-b8a7-c5960f6ea26c"). InnerVolumeSpecName "kube-api-access-bpmkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:33:31 crc kubenswrapper[4734]: I1205 23:33:31.498295 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpmkt\" (UniqueName: \"kubernetes.io/projected/bf2ea490-afd1-41e0-b8a7-c5960f6ea26c-kube-api-access-bpmkt\") on node \"crc\" DevicePath \"\"" Dec 05 23:33:31 crc kubenswrapper[4734]: I1205 23:33:31.876253 4734 generic.go:334] "Generic (PLEG): container finished" podID="bf2ea490-afd1-41e0-b8a7-c5960f6ea26c" containerID="3aea987dd6bf3ed211178f5d13255c0fd51f5a4545f3439117a9e59abd79cee3" exitCode=0 Dec 05 23:33:31 crc kubenswrapper[4734]: I1205 23:33:31.876310 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vfvnt" event={"ID":"bf2ea490-afd1-41e0-b8a7-c5960f6ea26c","Type":"ContainerDied","Data":"3aea987dd6bf3ed211178f5d13255c0fd51f5a4545f3439117a9e59abd79cee3"} Dec 05 23:33:31 crc kubenswrapper[4734]: I1205 23:33:31.876783 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vfvnt" 
event={"ID":"bf2ea490-afd1-41e0-b8a7-c5960f6ea26c","Type":"ContainerDied","Data":"c9aaf40b4145450d76fb22299d27ce8efdb39dcc5cdc95628f1343f742b3b228"} Dec 05 23:33:31 crc kubenswrapper[4734]: I1205 23:33:31.876409 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-vfvnt" Dec 05 23:33:31 crc kubenswrapper[4734]: I1205 23:33:31.876888 4734 scope.go:117] "RemoveContainer" containerID="3aea987dd6bf3ed211178f5d13255c0fd51f5a4545f3439117a9e59abd79cee3" Dec 05 23:33:31 crc kubenswrapper[4734]: I1205 23:33:31.908700 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-vfvnt"] Dec 05 23:33:31 crc kubenswrapper[4734]: I1205 23:33:31.910172 4734 scope.go:117] "RemoveContainer" containerID="3aea987dd6bf3ed211178f5d13255c0fd51f5a4545f3439117a9e59abd79cee3" Dec 05 23:33:31 crc kubenswrapper[4734]: E1205 23:33:31.910880 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3aea987dd6bf3ed211178f5d13255c0fd51f5a4545f3439117a9e59abd79cee3\": container with ID starting with 3aea987dd6bf3ed211178f5d13255c0fd51f5a4545f3439117a9e59abd79cee3 not found: ID does not exist" containerID="3aea987dd6bf3ed211178f5d13255c0fd51f5a4545f3439117a9e59abd79cee3" Dec 05 23:33:31 crc kubenswrapper[4734]: I1205 23:33:31.910956 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aea987dd6bf3ed211178f5d13255c0fd51f5a4545f3439117a9e59abd79cee3"} err="failed to get container status \"3aea987dd6bf3ed211178f5d13255c0fd51f5a4545f3439117a9e59abd79cee3\": rpc error: code = NotFound desc = could not find container \"3aea987dd6bf3ed211178f5d13255c0fd51f5a4545f3439117a9e59abd79cee3\": container with ID starting with 3aea987dd6bf3ed211178f5d13255c0fd51f5a4545f3439117a9e59abd79cee3 not found: ID does not exist" Dec 05 23:33:31 crc kubenswrapper[4734]: I1205 
23:33:31.913898 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-vfvnt"] Dec 05 23:33:33 crc kubenswrapper[4734]: I1205 23:33:33.624913 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf2ea490-afd1-41e0-b8a7-c5960f6ea26c" path="/var/lib/kubelet/pods/bf2ea490-afd1-41e0-b8a7-c5960f6ea26c/volumes" Dec 05 23:33:39 crc kubenswrapper[4734]: I1205 23:33:39.951737 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-c7wgh" Dec 05 23:33:39 crc kubenswrapper[4734]: I1205 23:33:39.952424 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-c7wgh" Dec 05 23:33:39 crc kubenswrapper[4734]: I1205 23:33:39.996807 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-c7wgh" Dec 05 23:33:40 crc kubenswrapper[4734]: I1205 23:33:40.975701 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-c7wgh" Dec 05 23:33:41 crc kubenswrapper[4734]: I1205 23:33:41.607613 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-klfrp" Dec 05 23:33:47 crc kubenswrapper[4734]: I1205 23:33:47.750364 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x"] Dec 05 23:33:47 crc kubenswrapper[4734]: E1205 23:33:47.751381 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf2ea490-afd1-41e0-b8a7-c5960f6ea26c" containerName="registry-server" Dec 05 23:33:47 crc kubenswrapper[4734]: I1205 23:33:47.751400 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf2ea490-afd1-41e0-b8a7-c5960f6ea26c" containerName="registry-server" Dec 05 23:33:47 crc kubenswrapper[4734]: I1205 23:33:47.751683 4734 
memory_manager.go:354] "RemoveStaleState removing state" podUID="bf2ea490-afd1-41e0-b8a7-c5960f6ea26c" containerName="registry-server" Dec 05 23:33:47 crc kubenswrapper[4734]: I1205 23:33:47.752821 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x" Dec 05 23:33:47 crc kubenswrapper[4734]: I1205 23:33:47.755453 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-qwkzx" Dec 05 23:33:47 crc kubenswrapper[4734]: I1205 23:33:47.757944 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x"] Dec 05 23:33:47 crc kubenswrapper[4734]: I1205 23:33:47.842010 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d6a5b5d0-ee84-4715-8024-25698133af6b-bundle\") pod \"15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x\" (UID: \"d6a5b5d0-ee84-4715-8024-25698133af6b\") " pod="openstack-operators/15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x" Dec 05 23:33:47 crc kubenswrapper[4734]: I1205 23:33:47.842084 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d6a5b5d0-ee84-4715-8024-25698133af6b-util\") pod \"15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x\" (UID: \"d6a5b5d0-ee84-4715-8024-25698133af6b\") " pod="openstack-operators/15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x" Dec 05 23:33:47 crc kubenswrapper[4734]: I1205 23:33:47.842108 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7nsf\" (UniqueName: \"kubernetes.io/projected/d6a5b5d0-ee84-4715-8024-25698133af6b-kube-api-access-w7nsf\") pod 
\"15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x\" (UID: \"d6a5b5d0-ee84-4715-8024-25698133af6b\") " pod="openstack-operators/15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x" Dec 05 23:33:47 crc kubenswrapper[4734]: I1205 23:33:47.942995 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d6a5b5d0-ee84-4715-8024-25698133af6b-util\") pod \"15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x\" (UID: \"d6a5b5d0-ee84-4715-8024-25698133af6b\") " pod="openstack-operators/15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x" Dec 05 23:33:47 crc kubenswrapper[4734]: I1205 23:33:47.943043 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7nsf\" (UniqueName: \"kubernetes.io/projected/d6a5b5d0-ee84-4715-8024-25698133af6b-kube-api-access-w7nsf\") pod \"15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x\" (UID: \"d6a5b5d0-ee84-4715-8024-25698133af6b\") " pod="openstack-operators/15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x" Dec 05 23:33:47 crc kubenswrapper[4734]: I1205 23:33:47.943105 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d6a5b5d0-ee84-4715-8024-25698133af6b-bundle\") pod \"15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x\" (UID: \"d6a5b5d0-ee84-4715-8024-25698133af6b\") " pod="openstack-operators/15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x" Dec 05 23:33:47 crc kubenswrapper[4734]: I1205 23:33:47.943657 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d6a5b5d0-ee84-4715-8024-25698133af6b-util\") pod \"15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x\" (UID: \"d6a5b5d0-ee84-4715-8024-25698133af6b\") " 
pod="openstack-operators/15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x" Dec 05 23:33:47 crc kubenswrapper[4734]: I1205 23:33:47.943687 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d6a5b5d0-ee84-4715-8024-25698133af6b-bundle\") pod \"15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x\" (UID: \"d6a5b5d0-ee84-4715-8024-25698133af6b\") " pod="openstack-operators/15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x" Dec 05 23:33:47 crc kubenswrapper[4734]: I1205 23:33:47.968149 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7nsf\" (UniqueName: \"kubernetes.io/projected/d6a5b5d0-ee84-4715-8024-25698133af6b-kube-api-access-w7nsf\") pod \"15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x\" (UID: \"d6a5b5d0-ee84-4715-8024-25698133af6b\") " pod="openstack-operators/15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x" Dec 05 23:33:48 crc kubenswrapper[4734]: I1205 23:33:48.073025 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x" Dec 05 23:33:48 crc kubenswrapper[4734]: I1205 23:33:48.290224 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x"] Dec 05 23:33:49 crc kubenswrapper[4734]: I1205 23:33:49.007841 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x" event={"ID":"d6a5b5d0-ee84-4715-8024-25698133af6b","Type":"ContainerStarted","Data":"afa6a41927ab443ac523c4df58db6a08a166443f5d791bc852a862344f965b34"} Dec 05 23:33:50 crc kubenswrapper[4734]: I1205 23:33:50.016909 4734 generic.go:334] "Generic (PLEG): container finished" podID="d6a5b5d0-ee84-4715-8024-25698133af6b" containerID="05b41900d5ca1d271b18b91101ef2b6282050dddfb2c93fc2a9750ff528d482f" exitCode=0 Dec 05 23:33:50 crc kubenswrapper[4734]: I1205 23:33:50.017251 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x" event={"ID":"d6a5b5d0-ee84-4715-8024-25698133af6b","Type":"ContainerDied","Data":"05b41900d5ca1d271b18b91101ef2b6282050dddfb2c93fc2a9750ff528d482f"} Dec 05 23:33:51 crc kubenswrapper[4734]: I1205 23:33:51.033694 4734 generic.go:334] "Generic (PLEG): container finished" podID="d6a5b5d0-ee84-4715-8024-25698133af6b" containerID="492686f519fc640f2aae4b4c28e2f3dc4cb71751f905d046aecdea97667f40a4" exitCode=0 Dec 05 23:33:51 crc kubenswrapper[4734]: I1205 23:33:51.034308 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x" event={"ID":"d6a5b5d0-ee84-4715-8024-25698133af6b","Type":"ContainerDied","Data":"492686f519fc640f2aae4b4c28e2f3dc4cb71751f905d046aecdea97667f40a4"} Dec 05 23:33:52 crc kubenswrapper[4734]: I1205 23:33:52.045609 4734 generic.go:334] 
"Generic (PLEG): container finished" podID="d6a5b5d0-ee84-4715-8024-25698133af6b" containerID="9b43149328df0f9a7a37c213395147ef6ba1dd191c55b22daf787b9125b0c0b8" exitCode=0 Dec 05 23:33:52 crc kubenswrapper[4734]: I1205 23:33:52.045751 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x" event={"ID":"d6a5b5d0-ee84-4715-8024-25698133af6b","Type":"ContainerDied","Data":"9b43149328df0f9a7a37c213395147ef6ba1dd191c55b22daf787b9125b0c0b8"} Dec 05 23:33:53 crc kubenswrapper[4734]: I1205 23:33:53.372455 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x" Dec 05 23:33:53 crc kubenswrapper[4734]: I1205 23:33:53.540496 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d6a5b5d0-ee84-4715-8024-25698133af6b-util\") pod \"d6a5b5d0-ee84-4715-8024-25698133af6b\" (UID: \"d6a5b5d0-ee84-4715-8024-25698133af6b\") " Dec 05 23:33:53 crc kubenswrapper[4734]: I1205 23:33:53.540706 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d6a5b5d0-ee84-4715-8024-25698133af6b-bundle\") pod \"d6a5b5d0-ee84-4715-8024-25698133af6b\" (UID: \"d6a5b5d0-ee84-4715-8024-25698133af6b\") " Dec 05 23:33:53 crc kubenswrapper[4734]: I1205 23:33:53.540744 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7nsf\" (UniqueName: \"kubernetes.io/projected/d6a5b5d0-ee84-4715-8024-25698133af6b-kube-api-access-w7nsf\") pod \"d6a5b5d0-ee84-4715-8024-25698133af6b\" (UID: \"d6a5b5d0-ee84-4715-8024-25698133af6b\") " Dec 05 23:33:53 crc kubenswrapper[4734]: I1205 23:33:53.541782 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d6a5b5d0-ee84-4715-8024-25698133af6b-bundle" (OuterVolumeSpecName: "bundle") pod "d6a5b5d0-ee84-4715-8024-25698133af6b" (UID: "d6a5b5d0-ee84-4715-8024-25698133af6b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:33:53 crc kubenswrapper[4734]: I1205 23:33:53.548412 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6a5b5d0-ee84-4715-8024-25698133af6b-kube-api-access-w7nsf" (OuterVolumeSpecName: "kube-api-access-w7nsf") pod "d6a5b5d0-ee84-4715-8024-25698133af6b" (UID: "d6a5b5d0-ee84-4715-8024-25698133af6b"). InnerVolumeSpecName "kube-api-access-w7nsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:33:53 crc kubenswrapper[4734]: I1205 23:33:53.560466 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6a5b5d0-ee84-4715-8024-25698133af6b-util" (OuterVolumeSpecName: "util") pod "d6a5b5d0-ee84-4715-8024-25698133af6b" (UID: "d6a5b5d0-ee84-4715-8024-25698133af6b"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:33:53 crc kubenswrapper[4734]: I1205 23:33:53.642604 4734 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d6a5b5d0-ee84-4715-8024-25698133af6b-util\") on node \"crc\" DevicePath \"\"" Dec 05 23:33:53 crc kubenswrapper[4734]: I1205 23:33:53.642651 4734 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d6a5b5d0-ee84-4715-8024-25698133af6b-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:33:53 crc kubenswrapper[4734]: I1205 23:33:53.642671 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7nsf\" (UniqueName: \"kubernetes.io/projected/d6a5b5d0-ee84-4715-8024-25698133af6b-kube-api-access-w7nsf\") on node \"crc\" DevicePath \"\"" Dec 05 23:33:54 crc kubenswrapper[4734]: I1205 23:33:54.067248 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x" event={"ID":"d6a5b5d0-ee84-4715-8024-25698133af6b","Type":"ContainerDied","Data":"afa6a41927ab443ac523c4df58db6a08a166443f5d791bc852a862344f965b34"} Dec 05 23:33:54 crc kubenswrapper[4734]: I1205 23:33:54.067304 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afa6a41927ab443ac523c4df58db6a08a166443f5d791bc852a862344f965b34" Dec 05 23:33:54 crc kubenswrapper[4734]: I1205 23:33:54.067434 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x" Dec 05 23:33:59 crc kubenswrapper[4734]: I1205 23:33:59.732375 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-58b957df85-wbwkx"] Dec 05 23:33:59 crc kubenswrapper[4734]: E1205 23:33:59.733520 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6a5b5d0-ee84-4715-8024-25698133af6b" containerName="util" Dec 05 23:33:59 crc kubenswrapper[4734]: I1205 23:33:59.733554 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6a5b5d0-ee84-4715-8024-25698133af6b" containerName="util" Dec 05 23:33:59 crc kubenswrapper[4734]: E1205 23:33:59.733588 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6a5b5d0-ee84-4715-8024-25698133af6b" containerName="pull" Dec 05 23:33:59 crc kubenswrapper[4734]: I1205 23:33:59.733597 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6a5b5d0-ee84-4715-8024-25698133af6b" containerName="pull" Dec 05 23:33:59 crc kubenswrapper[4734]: E1205 23:33:59.733622 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6a5b5d0-ee84-4715-8024-25698133af6b" containerName="extract" Dec 05 23:33:59 crc kubenswrapper[4734]: I1205 23:33:59.733631 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6a5b5d0-ee84-4715-8024-25698133af6b" containerName="extract" Dec 05 23:33:59 crc kubenswrapper[4734]: I1205 23:33:59.733792 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6a5b5d0-ee84-4715-8024-25698133af6b" containerName="extract" Dec 05 23:33:59 crc kubenswrapper[4734]: I1205 23:33:59.734358 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-58b957df85-wbwkx" Dec 05 23:33:59 crc kubenswrapper[4734]: I1205 23:33:59.738052 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-qbrwq" Dec 05 23:33:59 crc kubenswrapper[4734]: I1205 23:33:59.753806 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-58b957df85-wbwkx"] Dec 05 23:33:59 crc kubenswrapper[4734]: I1205 23:33:59.843514 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxwkl\" (UniqueName: \"kubernetes.io/projected/9ef6c4e9-8341-489c-9f21-ffda1c3ef34a-kube-api-access-nxwkl\") pod \"openstack-operator-controller-operator-58b957df85-wbwkx\" (UID: \"9ef6c4e9-8341-489c-9f21-ffda1c3ef34a\") " pod="openstack-operators/openstack-operator-controller-operator-58b957df85-wbwkx" Dec 05 23:33:59 crc kubenswrapper[4734]: I1205 23:33:59.945823 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxwkl\" (UniqueName: \"kubernetes.io/projected/9ef6c4e9-8341-489c-9f21-ffda1c3ef34a-kube-api-access-nxwkl\") pod \"openstack-operator-controller-operator-58b957df85-wbwkx\" (UID: \"9ef6c4e9-8341-489c-9f21-ffda1c3ef34a\") " pod="openstack-operators/openstack-operator-controller-operator-58b957df85-wbwkx" Dec 05 23:33:59 crc kubenswrapper[4734]: I1205 23:33:59.983999 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxwkl\" (UniqueName: \"kubernetes.io/projected/9ef6c4e9-8341-489c-9f21-ffda1c3ef34a-kube-api-access-nxwkl\") pod \"openstack-operator-controller-operator-58b957df85-wbwkx\" (UID: \"9ef6c4e9-8341-489c-9f21-ffda1c3ef34a\") " pod="openstack-operators/openstack-operator-controller-operator-58b957df85-wbwkx" Dec 05 23:34:00 crc kubenswrapper[4734]: I1205 23:34:00.054017 4734 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-58b957df85-wbwkx" Dec 05 23:34:00 crc kubenswrapper[4734]: I1205 23:34:00.303930 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-58b957df85-wbwkx"] Dec 05 23:34:00 crc kubenswrapper[4734]: W1205 23:34:00.317433 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ef6c4e9_8341_489c_9f21_ffda1c3ef34a.slice/crio-70e86b5786b78559e29eecd419546fee4aff0911134be063c00fe8db915c6aba WatchSource:0}: Error finding container 70e86b5786b78559e29eecd419546fee4aff0911134be063c00fe8db915c6aba: Status 404 returned error can't find the container with id 70e86b5786b78559e29eecd419546fee4aff0911134be063c00fe8db915c6aba Dec 05 23:34:01 crc kubenswrapper[4734]: I1205 23:34:01.149782 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-58b957df85-wbwkx" event={"ID":"9ef6c4e9-8341-489c-9f21-ffda1c3ef34a","Type":"ContainerStarted","Data":"70e86b5786b78559e29eecd419546fee4aff0911134be063c00fe8db915c6aba"} Dec 05 23:34:05 crc kubenswrapper[4734]: I1205 23:34:05.192415 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-58b957df85-wbwkx" event={"ID":"9ef6c4e9-8341-489c-9f21-ffda1c3ef34a","Type":"ContainerStarted","Data":"0ff504697280b53f340099674cb67f85a492d67f3f4a1b8cf8deb33491ab5b24"} Dec 05 23:34:05 crc kubenswrapper[4734]: I1205 23:34:05.193227 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-58b957df85-wbwkx" Dec 05 23:34:05 crc kubenswrapper[4734]: I1205 23:34:05.221615 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-controller-operator-58b957df85-wbwkx" podStartSLOduration=1.7821276419999998 podStartE2EDuration="6.2215897s" podCreationTimestamp="2025-12-05 23:33:59 +0000 UTC" firstStartedPulling="2025-12-05 23:34:00.319900365 +0000 UTC m=+861.003304641" lastFinishedPulling="2025-12-05 23:34:04.759362433 +0000 UTC m=+865.442766699" observedRunningTime="2025-12-05 23:34:05.219580292 +0000 UTC m=+865.902984568" watchObservedRunningTime="2025-12-05 23:34:05.2215897 +0000 UTC m=+865.904993976" Dec 05 23:34:05 crc kubenswrapper[4734]: I1205 23:34:05.627959 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-npq48"] Dec 05 23:34:05 crc kubenswrapper[4734]: I1205 23:34:05.629468 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-npq48" Dec 05 23:34:05 crc kubenswrapper[4734]: I1205 23:34:05.636278 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-npq48"] Dec 05 23:34:05 crc kubenswrapper[4734]: I1205 23:34:05.745075 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr4hr\" (UniqueName: \"kubernetes.io/projected/73bd4d64-2b2a-4c73-87ad-7f24371f685d-kube-api-access-zr4hr\") pod \"redhat-marketplace-npq48\" (UID: \"73bd4d64-2b2a-4c73-87ad-7f24371f685d\") " pod="openshift-marketplace/redhat-marketplace-npq48" Dec 05 23:34:05 crc kubenswrapper[4734]: I1205 23:34:05.745132 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73bd4d64-2b2a-4c73-87ad-7f24371f685d-utilities\") pod \"redhat-marketplace-npq48\" (UID: \"73bd4d64-2b2a-4c73-87ad-7f24371f685d\") " pod="openshift-marketplace/redhat-marketplace-npq48" Dec 05 23:34:05 crc kubenswrapper[4734]: I1205 23:34:05.745343 4734 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73bd4d64-2b2a-4c73-87ad-7f24371f685d-catalog-content\") pod \"redhat-marketplace-npq48\" (UID: \"73bd4d64-2b2a-4c73-87ad-7f24371f685d\") " pod="openshift-marketplace/redhat-marketplace-npq48" Dec 05 23:34:05 crc kubenswrapper[4734]: I1205 23:34:05.846807 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73bd4d64-2b2a-4c73-87ad-7f24371f685d-catalog-content\") pod \"redhat-marketplace-npq48\" (UID: \"73bd4d64-2b2a-4c73-87ad-7f24371f685d\") " pod="openshift-marketplace/redhat-marketplace-npq48" Dec 05 23:34:05 crc kubenswrapper[4734]: I1205 23:34:05.847395 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73bd4d64-2b2a-4c73-87ad-7f24371f685d-catalog-content\") pod \"redhat-marketplace-npq48\" (UID: \"73bd4d64-2b2a-4c73-87ad-7f24371f685d\") " pod="openshift-marketplace/redhat-marketplace-npq48" Dec 05 23:34:05 crc kubenswrapper[4734]: I1205 23:34:05.847807 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr4hr\" (UniqueName: \"kubernetes.io/projected/73bd4d64-2b2a-4c73-87ad-7f24371f685d-kube-api-access-zr4hr\") pod \"redhat-marketplace-npq48\" (UID: \"73bd4d64-2b2a-4c73-87ad-7f24371f685d\") " pod="openshift-marketplace/redhat-marketplace-npq48" Dec 05 23:34:05 crc kubenswrapper[4734]: I1205 23:34:05.847962 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73bd4d64-2b2a-4c73-87ad-7f24371f685d-utilities\") pod \"redhat-marketplace-npq48\" (UID: \"73bd4d64-2b2a-4c73-87ad-7f24371f685d\") " pod="openshift-marketplace/redhat-marketplace-npq48" Dec 05 23:34:05 crc kubenswrapper[4734]: I1205 23:34:05.848595 4734 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73bd4d64-2b2a-4c73-87ad-7f24371f685d-utilities\") pod \"redhat-marketplace-npq48\" (UID: \"73bd4d64-2b2a-4c73-87ad-7f24371f685d\") " pod="openshift-marketplace/redhat-marketplace-npq48" Dec 05 23:34:05 crc kubenswrapper[4734]: I1205 23:34:05.882705 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr4hr\" (UniqueName: \"kubernetes.io/projected/73bd4d64-2b2a-4c73-87ad-7f24371f685d-kube-api-access-zr4hr\") pod \"redhat-marketplace-npq48\" (UID: \"73bd4d64-2b2a-4c73-87ad-7f24371f685d\") " pod="openshift-marketplace/redhat-marketplace-npq48" Dec 05 23:34:05 crc kubenswrapper[4734]: I1205 23:34:05.952110 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-npq48" Dec 05 23:34:06 crc kubenswrapper[4734]: I1205 23:34:06.176216 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-npq48"] Dec 05 23:34:06 crc kubenswrapper[4734]: W1205 23:34:06.185247 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73bd4d64_2b2a_4c73_87ad_7f24371f685d.slice/crio-b8f2dfad0dd84d23a414ad7a020103b39a24ae7cdbed7a3af64480c52fa31b1e WatchSource:0}: Error finding container b8f2dfad0dd84d23a414ad7a020103b39a24ae7cdbed7a3af64480c52fa31b1e: Status 404 returned error can't find the container with id b8f2dfad0dd84d23a414ad7a020103b39a24ae7cdbed7a3af64480c52fa31b1e Dec 05 23:34:06 crc kubenswrapper[4734]: I1205 23:34:06.208491 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-npq48" event={"ID":"73bd4d64-2b2a-4c73-87ad-7f24371f685d","Type":"ContainerStarted","Data":"b8f2dfad0dd84d23a414ad7a020103b39a24ae7cdbed7a3af64480c52fa31b1e"} Dec 05 23:34:07 crc kubenswrapper[4734]: I1205 23:34:07.217599 4734 generic.go:334] "Generic 
(PLEG): container finished" podID="73bd4d64-2b2a-4c73-87ad-7f24371f685d" containerID="b8b9a4d06e11bfaf78efdd5b8b390b1b1a3b80cfdaea19da622405b3ab07f2f7" exitCode=0 Dec 05 23:34:07 crc kubenswrapper[4734]: I1205 23:34:07.217719 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-npq48" event={"ID":"73bd4d64-2b2a-4c73-87ad-7f24371f685d","Type":"ContainerDied","Data":"b8b9a4d06e11bfaf78efdd5b8b390b1b1a3b80cfdaea19da622405b3ab07f2f7"} Dec 05 23:34:07 crc kubenswrapper[4734]: I1205 23:34:07.624475 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n5tlr"] Dec 05 23:34:07 crc kubenswrapper[4734]: I1205 23:34:07.626543 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n5tlr" Dec 05 23:34:07 crc kubenswrapper[4734]: I1205 23:34:07.632317 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n5tlr"] Dec 05 23:34:07 crc kubenswrapper[4734]: I1205 23:34:07.782583 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6fa2f15-eddf-41a6-9299-8bd85f819a33-catalog-content\") pod \"community-operators-n5tlr\" (UID: \"a6fa2f15-eddf-41a6-9299-8bd85f819a33\") " pod="openshift-marketplace/community-operators-n5tlr" Dec 05 23:34:07 crc kubenswrapper[4734]: I1205 23:34:07.782674 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6fa2f15-eddf-41a6-9299-8bd85f819a33-utilities\") pod \"community-operators-n5tlr\" (UID: \"a6fa2f15-eddf-41a6-9299-8bd85f819a33\") " pod="openshift-marketplace/community-operators-n5tlr" Dec 05 23:34:07 crc kubenswrapper[4734]: I1205 23:34:07.782758 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-xvhrn\" (UniqueName: \"kubernetes.io/projected/a6fa2f15-eddf-41a6-9299-8bd85f819a33-kube-api-access-xvhrn\") pod \"community-operators-n5tlr\" (UID: \"a6fa2f15-eddf-41a6-9299-8bd85f819a33\") " pod="openshift-marketplace/community-operators-n5tlr" Dec 05 23:34:07 crc kubenswrapper[4734]: I1205 23:34:07.884670 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6fa2f15-eddf-41a6-9299-8bd85f819a33-utilities\") pod \"community-operators-n5tlr\" (UID: \"a6fa2f15-eddf-41a6-9299-8bd85f819a33\") " pod="openshift-marketplace/community-operators-n5tlr" Dec 05 23:34:07 crc kubenswrapper[4734]: I1205 23:34:07.884814 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvhrn\" (UniqueName: \"kubernetes.io/projected/a6fa2f15-eddf-41a6-9299-8bd85f819a33-kube-api-access-xvhrn\") pod \"community-operators-n5tlr\" (UID: \"a6fa2f15-eddf-41a6-9299-8bd85f819a33\") " pod="openshift-marketplace/community-operators-n5tlr" Dec 05 23:34:07 crc kubenswrapper[4734]: I1205 23:34:07.884864 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6fa2f15-eddf-41a6-9299-8bd85f819a33-catalog-content\") pod \"community-operators-n5tlr\" (UID: \"a6fa2f15-eddf-41a6-9299-8bd85f819a33\") " pod="openshift-marketplace/community-operators-n5tlr" Dec 05 23:34:07 crc kubenswrapper[4734]: I1205 23:34:07.885460 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6fa2f15-eddf-41a6-9299-8bd85f819a33-catalog-content\") pod \"community-operators-n5tlr\" (UID: \"a6fa2f15-eddf-41a6-9299-8bd85f819a33\") " pod="openshift-marketplace/community-operators-n5tlr" Dec 05 23:34:07 crc kubenswrapper[4734]: I1205 23:34:07.885602 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6fa2f15-eddf-41a6-9299-8bd85f819a33-utilities\") pod \"community-operators-n5tlr\" (UID: \"a6fa2f15-eddf-41a6-9299-8bd85f819a33\") " pod="openshift-marketplace/community-operators-n5tlr" Dec 05 23:34:07 crc kubenswrapper[4734]: I1205 23:34:07.908089 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvhrn\" (UniqueName: \"kubernetes.io/projected/a6fa2f15-eddf-41a6-9299-8bd85f819a33-kube-api-access-xvhrn\") pod \"community-operators-n5tlr\" (UID: \"a6fa2f15-eddf-41a6-9299-8bd85f819a33\") " pod="openshift-marketplace/community-operators-n5tlr" Dec 05 23:34:07 crc kubenswrapper[4734]: I1205 23:34:07.949712 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n5tlr" Dec 05 23:34:08 crc kubenswrapper[4734]: I1205 23:34:08.225680 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-npq48" event={"ID":"73bd4d64-2b2a-4c73-87ad-7f24371f685d","Type":"ContainerStarted","Data":"89af057ae36aeb638aadbc912da198e809baca50cbbe91df4ee4fead7ca30795"} Dec 05 23:34:08 crc kubenswrapper[4734]: I1205 23:34:08.329397 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n5tlr"] Dec 05 23:34:08 crc kubenswrapper[4734]: W1205 23:34:08.345550 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6fa2f15_eddf_41a6_9299_8bd85f819a33.slice/crio-b4e8533c75ae0aac9a5835c36f22e02a063228f50eccb02519bd06434007fd83 WatchSource:0}: Error finding container b4e8533c75ae0aac9a5835c36f22e02a063228f50eccb02519bd06434007fd83: Status 404 returned error can't find the container with id b4e8533c75ae0aac9a5835c36f22e02a063228f50eccb02519bd06434007fd83 Dec 05 23:34:09 crc kubenswrapper[4734]: I1205 23:34:09.236662 4734 generic.go:334] "Generic (PLEG): container finished" 
podID="a6fa2f15-eddf-41a6-9299-8bd85f819a33" containerID="dc46da953e00304837889572886e0b2f6d9c5bdb0fbb210842f3f56f4cb4a905" exitCode=0 Dec 05 23:34:09 crc kubenswrapper[4734]: I1205 23:34:09.236785 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n5tlr" event={"ID":"a6fa2f15-eddf-41a6-9299-8bd85f819a33","Type":"ContainerDied","Data":"dc46da953e00304837889572886e0b2f6d9c5bdb0fbb210842f3f56f4cb4a905"} Dec 05 23:34:09 crc kubenswrapper[4734]: I1205 23:34:09.238218 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n5tlr" event={"ID":"a6fa2f15-eddf-41a6-9299-8bd85f819a33","Type":"ContainerStarted","Data":"b4e8533c75ae0aac9a5835c36f22e02a063228f50eccb02519bd06434007fd83"} Dec 05 23:34:09 crc kubenswrapper[4734]: I1205 23:34:09.241367 4734 generic.go:334] "Generic (PLEG): container finished" podID="73bd4d64-2b2a-4c73-87ad-7f24371f685d" containerID="89af057ae36aeb638aadbc912da198e809baca50cbbe91df4ee4fead7ca30795" exitCode=0 Dec 05 23:34:09 crc kubenswrapper[4734]: I1205 23:34:09.241419 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-npq48" event={"ID":"73bd4d64-2b2a-4c73-87ad-7f24371f685d","Type":"ContainerDied","Data":"89af057ae36aeb638aadbc912da198e809baca50cbbe91df4ee4fead7ca30795"} Dec 05 23:34:10 crc kubenswrapper[4734]: I1205 23:34:10.056952 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-58b957df85-wbwkx" Dec 05 23:34:10 crc kubenswrapper[4734]: I1205 23:34:10.248725 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-npq48" event={"ID":"73bd4d64-2b2a-4c73-87ad-7f24371f685d","Type":"ContainerStarted","Data":"40490c53d5e87da4d43858d6746f36a9e2de5aa2bb1481e85aac3cbd9725cabd"} Dec 05 23:34:10 crc kubenswrapper[4734]: I1205 23:34:10.251720 4734 generic.go:334] "Generic (PLEG): 
container finished" podID="a6fa2f15-eddf-41a6-9299-8bd85f819a33" containerID="e9c8af3e53c63f10f3657ecd63a152cf6e92611f1e469dbdf851f2f00ca9cb6b" exitCode=0 Dec 05 23:34:10 crc kubenswrapper[4734]: I1205 23:34:10.251770 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n5tlr" event={"ID":"a6fa2f15-eddf-41a6-9299-8bd85f819a33","Type":"ContainerDied","Data":"e9c8af3e53c63f10f3657ecd63a152cf6e92611f1e469dbdf851f2f00ca9cb6b"} Dec 05 23:34:10 crc kubenswrapper[4734]: I1205 23:34:10.274118 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-npq48" podStartSLOduration=2.865100902 podStartE2EDuration="5.274055066s" podCreationTimestamp="2025-12-05 23:34:05 +0000 UTC" firstStartedPulling="2025-12-05 23:34:07.220543231 +0000 UTC m=+867.903947517" lastFinishedPulling="2025-12-05 23:34:09.629497405 +0000 UTC m=+870.312901681" observedRunningTime="2025-12-05 23:34:10.271667287 +0000 UTC m=+870.955071563" watchObservedRunningTime="2025-12-05 23:34:10.274055066 +0000 UTC m=+870.957459352" Dec 05 23:34:11 crc kubenswrapper[4734]: I1205 23:34:11.266172 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n5tlr" event={"ID":"a6fa2f15-eddf-41a6-9299-8bd85f819a33","Type":"ContainerStarted","Data":"4753e2e43b50d29e70337333f51a4146f55bbc0c2b1b559ab79eff1fcb2407c0"} Dec 05 23:34:11 crc kubenswrapper[4734]: I1205 23:34:11.298662 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n5tlr" podStartSLOduration=2.894750456 podStartE2EDuration="4.298638094s" podCreationTimestamp="2025-12-05 23:34:07 +0000 UTC" firstStartedPulling="2025-12-05 23:34:09.239577648 +0000 UTC m=+869.922981934" lastFinishedPulling="2025-12-05 23:34:10.643465296 +0000 UTC m=+871.326869572" observedRunningTime="2025-12-05 23:34:11.292073634 +0000 UTC m=+871.975477910" 
watchObservedRunningTime="2025-12-05 23:34:11.298638094 +0000 UTC m=+871.982042370" Dec 05 23:34:15 crc kubenswrapper[4734]: I1205 23:34:15.953084 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-npq48" Dec 05 23:34:15 crc kubenswrapper[4734]: I1205 23:34:15.955253 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-npq48" Dec 05 23:34:16 crc kubenswrapper[4734]: I1205 23:34:16.020429 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-npq48" Dec 05 23:34:16 crc kubenswrapper[4734]: I1205 23:34:16.349269 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-npq48" Dec 05 23:34:17 crc kubenswrapper[4734]: I1205 23:34:17.951505 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n5tlr" Dec 05 23:34:17 crc kubenswrapper[4734]: I1205 23:34:17.953210 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n5tlr" Dec 05 23:34:18 crc kubenswrapper[4734]: I1205 23:34:18.021738 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n5tlr" Dec 05 23:34:18 crc kubenswrapper[4734]: I1205 23:34:18.364955 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n5tlr" Dec 05 23:34:18 crc kubenswrapper[4734]: I1205 23:34:18.418479 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-npq48"] Dec 05 23:34:19 crc kubenswrapper[4734]: I1205 23:34:19.323784 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-npq48" 
podUID="73bd4d64-2b2a-4c73-87ad-7f24371f685d" containerName="registry-server" containerID="cri-o://40490c53d5e87da4d43858d6746f36a9e2de5aa2bb1481e85aac3cbd9725cabd" gracePeriod=2 Dec 05 23:34:21 crc kubenswrapper[4734]: I1205 23:34:21.344493 4734 generic.go:334] "Generic (PLEG): container finished" podID="73bd4d64-2b2a-4c73-87ad-7f24371f685d" containerID="40490c53d5e87da4d43858d6746f36a9e2de5aa2bb1481e85aac3cbd9725cabd" exitCode=0 Dec 05 23:34:21 crc kubenswrapper[4734]: I1205 23:34:21.344586 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-npq48" event={"ID":"73bd4d64-2b2a-4c73-87ad-7f24371f685d","Type":"ContainerDied","Data":"40490c53d5e87da4d43858d6746f36a9e2de5aa2bb1481e85aac3cbd9725cabd"} Dec 05 23:34:21 crc kubenswrapper[4734]: I1205 23:34:21.605716 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n5tlr"] Dec 05 23:34:21 crc kubenswrapper[4734]: I1205 23:34:21.606753 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n5tlr" podUID="a6fa2f15-eddf-41a6-9299-8bd85f819a33" containerName="registry-server" containerID="cri-o://4753e2e43b50d29e70337333f51a4146f55bbc0c2b1b559ab79eff1fcb2407c0" gracePeriod=2 Dec 05 23:34:22 crc kubenswrapper[4734]: I1205 23:34:22.364646 4734 generic.go:334] "Generic (PLEG): container finished" podID="a6fa2f15-eddf-41a6-9299-8bd85f819a33" containerID="4753e2e43b50d29e70337333f51a4146f55bbc0c2b1b559ab79eff1fcb2407c0" exitCode=0 Dec 05 23:34:22 crc kubenswrapper[4734]: I1205 23:34:22.364721 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n5tlr" event={"ID":"a6fa2f15-eddf-41a6-9299-8bd85f819a33","Type":"ContainerDied","Data":"4753e2e43b50d29e70337333f51a4146f55bbc0c2b1b559ab79eff1fcb2407c0"} Dec 05 23:34:22 crc kubenswrapper[4734]: I1205 23:34:22.561730 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-npq48" Dec 05 23:34:22 crc kubenswrapper[4734]: I1205 23:34:22.737241 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73bd4d64-2b2a-4c73-87ad-7f24371f685d-catalog-content\") pod \"73bd4d64-2b2a-4c73-87ad-7f24371f685d\" (UID: \"73bd4d64-2b2a-4c73-87ad-7f24371f685d\") " Dec 05 23:34:22 crc kubenswrapper[4734]: I1205 23:34:22.737312 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zr4hr\" (UniqueName: \"kubernetes.io/projected/73bd4d64-2b2a-4c73-87ad-7f24371f685d-kube-api-access-zr4hr\") pod \"73bd4d64-2b2a-4c73-87ad-7f24371f685d\" (UID: \"73bd4d64-2b2a-4c73-87ad-7f24371f685d\") " Dec 05 23:34:22 crc kubenswrapper[4734]: I1205 23:34:22.737391 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73bd4d64-2b2a-4c73-87ad-7f24371f685d-utilities\") pod \"73bd4d64-2b2a-4c73-87ad-7f24371f685d\" (UID: \"73bd4d64-2b2a-4c73-87ad-7f24371f685d\") " Dec 05 23:34:22 crc kubenswrapper[4734]: I1205 23:34:22.745776 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73bd4d64-2b2a-4c73-87ad-7f24371f685d-utilities" (OuterVolumeSpecName: "utilities") pod "73bd4d64-2b2a-4c73-87ad-7f24371f685d" (UID: "73bd4d64-2b2a-4c73-87ad-7f24371f685d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:34:22 crc kubenswrapper[4734]: I1205 23:34:22.753024 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73bd4d64-2b2a-4c73-87ad-7f24371f685d-kube-api-access-zr4hr" (OuterVolumeSpecName: "kube-api-access-zr4hr") pod "73bd4d64-2b2a-4c73-87ad-7f24371f685d" (UID: "73bd4d64-2b2a-4c73-87ad-7f24371f685d"). InnerVolumeSpecName "kube-api-access-zr4hr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:34:22 crc kubenswrapper[4734]: I1205 23:34:22.767841 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73bd4d64-2b2a-4c73-87ad-7f24371f685d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73bd4d64-2b2a-4c73-87ad-7f24371f685d" (UID: "73bd4d64-2b2a-4c73-87ad-7f24371f685d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:34:22 crc kubenswrapper[4734]: I1205 23:34:22.839464 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73bd4d64-2b2a-4c73-87ad-7f24371f685d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:34:22 crc kubenswrapper[4734]: I1205 23:34:22.839961 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zr4hr\" (UniqueName: \"kubernetes.io/projected/73bd4d64-2b2a-4c73-87ad-7f24371f685d-kube-api-access-zr4hr\") on node \"crc\" DevicePath \"\"" Dec 05 23:34:22 crc kubenswrapper[4734]: I1205 23:34:22.839973 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73bd4d64-2b2a-4c73-87ad-7f24371f685d-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:34:23 crc kubenswrapper[4734]: I1205 23:34:23.099839 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n5tlr" Dec 05 23:34:23 crc kubenswrapper[4734]: I1205 23:34:23.245391 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvhrn\" (UniqueName: \"kubernetes.io/projected/a6fa2f15-eddf-41a6-9299-8bd85f819a33-kube-api-access-xvhrn\") pod \"a6fa2f15-eddf-41a6-9299-8bd85f819a33\" (UID: \"a6fa2f15-eddf-41a6-9299-8bd85f819a33\") " Dec 05 23:34:23 crc kubenswrapper[4734]: I1205 23:34:23.245484 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6fa2f15-eddf-41a6-9299-8bd85f819a33-utilities\") pod \"a6fa2f15-eddf-41a6-9299-8bd85f819a33\" (UID: \"a6fa2f15-eddf-41a6-9299-8bd85f819a33\") " Dec 05 23:34:23 crc kubenswrapper[4734]: I1205 23:34:23.245614 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6fa2f15-eddf-41a6-9299-8bd85f819a33-catalog-content\") pod \"a6fa2f15-eddf-41a6-9299-8bd85f819a33\" (UID: \"a6fa2f15-eddf-41a6-9299-8bd85f819a33\") " Dec 05 23:34:23 crc kubenswrapper[4734]: I1205 23:34:23.246782 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6fa2f15-eddf-41a6-9299-8bd85f819a33-utilities" (OuterVolumeSpecName: "utilities") pod "a6fa2f15-eddf-41a6-9299-8bd85f819a33" (UID: "a6fa2f15-eddf-41a6-9299-8bd85f819a33"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:34:23 crc kubenswrapper[4734]: I1205 23:34:23.249569 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6fa2f15-eddf-41a6-9299-8bd85f819a33-kube-api-access-xvhrn" (OuterVolumeSpecName: "kube-api-access-xvhrn") pod "a6fa2f15-eddf-41a6-9299-8bd85f819a33" (UID: "a6fa2f15-eddf-41a6-9299-8bd85f819a33"). InnerVolumeSpecName "kube-api-access-xvhrn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:34:23 crc kubenswrapper[4734]: I1205 23:34:23.296482 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6fa2f15-eddf-41a6-9299-8bd85f819a33-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a6fa2f15-eddf-41a6-9299-8bd85f819a33" (UID: "a6fa2f15-eddf-41a6-9299-8bd85f819a33"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:34:23 crc kubenswrapper[4734]: I1205 23:34:23.347417 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6fa2f15-eddf-41a6-9299-8bd85f819a33-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:34:23 crc kubenswrapper[4734]: I1205 23:34:23.347459 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvhrn\" (UniqueName: \"kubernetes.io/projected/a6fa2f15-eddf-41a6-9299-8bd85f819a33-kube-api-access-xvhrn\") on node \"crc\" DevicePath \"\"" Dec 05 23:34:23 crc kubenswrapper[4734]: I1205 23:34:23.347473 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6fa2f15-eddf-41a6-9299-8bd85f819a33-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:34:23 crc kubenswrapper[4734]: I1205 23:34:23.374590 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n5tlr" Dec 05 23:34:23 crc kubenswrapper[4734]: I1205 23:34:23.374644 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n5tlr" event={"ID":"a6fa2f15-eddf-41a6-9299-8bd85f819a33","Type":"ContainerDied","Data":"b4e8533c75ae0aac9a5835c36f22e02a063228f50eccb02519bd06434007fd83"} Dec 05 23:34:23 crc kubenswrapper[4734]: I1205 23:34:23.374706 4734 scope.go:117] "RemoveContainer" containerID="4753e2e43b50d29e70337333f51a4146f55bbc0c2b1b559ab79eff1fcb2407c0" Dec 05 23:34:23 crc kubenswrapper[4734]: I1205 23:34:23.379949 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-npq48" event={"ID":"73bd4d64-2b2a-4c73-87ad-7f24371f685d","Type":"ContainerDied","Data":"b8f2dfad0dd84d23a414ad7a020103b39a24ae7cdbed7a3af64480c52fa31b1e"} Dec 05 23:34:23 crc kubenswrapper[4734]: I1205 23:34:23.380033 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-npq48" Dec 05 23:34:23 crc kubenswrapper[4734]: I1205 23:34:23.410434 4734 scope.go:117] "RemoveContainer" containerID="e9c8af3e53c63f10f3657ecd63a152cf6e92611f1e469dbdf851f2f00ca9cb6b" Dec 05 23:34:23 crc kubenswrapper[4734]: I1205 23:34:23.425558 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n5tlr"] Dec 05 23:34:23 crc kubenswrapper[4734]: I1205 23:34:23.432758 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n5tlr"] Dec 05 23:34:23 crc kubenswrapper[4734]: I1205 23:34:23.450892 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-npq48"] Dec 05 23:34:23 crc kubenswrapper[4734]: I1205 23:34:23.455098 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-npq48"] Dec 05 23:34:23 crc kubenswrapper[4734]: I1205 23:34:23.458322 4734 scope.go:117] "RemoveContainer" containerID="dc46da953e00304837889572886e0b2f6d9c5bdb0fbb210842f3f56f4cb4a905" Dec 05 23:34:23 crc kubenswrapper[4734]: I1205 23:34:23.475353 4734 scope.go:117] "RemoveContainer" containerID="40490c53d5e87da4d43858d6746f36a9e2de5aa2bb1481e85aac3cbd9725cabd" Dec 05 23:34:23 crc kubenswrapper[4734]: I1205 23:34:23.493692 4734 scope.go:117] "RemoveContainer" containerID="89af057ae36aeb638aadbc912da198e809baca50cbbe91df4ee4fead7ca30795" Dec 05 23:34:23 crc kubenswrapper[4734]: I1205 23:34:23.507490 4734 scope.go:117] "RemoveContainer" containerID="b8b9a4d06e11bfaf78efdd5b8b390b1b1a3b80cfdaea19da622405b3ab07f2f7" Dec 05 23:34:23 crc kubenswrapper[4734]: I1205 23:34:23.623191 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73bd4d64-2b2a-4c73-87ad-7f24371f685d" path="/var/lib/kubelet/pods/73bd4d64-2b2a-4c73-87ad-7f24371f685d/volumes" Dec 05 23:34:23 crc kubenswrapper[4734]: I1205 23:34:23.623895 4734 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6fa2f15-eddf-41a6-9299-8bd85f819a33" path="/var/lib/kubelet/pods/a6fa2f15-eddf-41a6-9299-8bd85f819a33/volumes" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.735223 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-bl5vb"] Dec 05 23:34:45 crc kubenswrapper[4734]: E1205 23:34:45.736110 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73bd4d64-2b2a-4c73-87ad-7f24371f685d" containerName="registry-server" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.736130 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="73bd4d64-2b2a-4c73-87ad-7f24371f685d" containerName="registry-server" Dec 05 23:34:45 crc kubenswrapper[4734]: E1205 23:34:45.736140 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6fa2f15-eddf-41a6-9299-8bd85f819a33" containerName="registry-server" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.736146 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6fa2f15-eddf-41a6-9299-8bd85f819a33" containerName="registry-server" Dec 05 23:34:45 crc kubenswrapper[4734]: E1205 23:34:45.736159 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6fa2f15-eddf-41a6-9299-8bd85f819a33" containerName="extract-utilities" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.736167 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6fa2f15-eddf-41a6-9299-8bd85f819a33" containerName="extract-utilities" Dec 05 23:34:45 crc kubenswrapper[4734]: E1205 23:34:45.736176 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6fa2f15-eddf-41a6-9299-8bd85f819a33" containerName="extract-content" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.736182 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6fa2f15-eddf-41a6-9299-8bd85f819a33" containerName="extract-content" Dec 05 23:34:45 crc 
kubenswrapper[4734]: E1205 23:34:45.736193 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73bd4d64-2b2a-4c73-87ad-7f24371f685d" containerName="extract-content" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.736198 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="73bd4d64-2b2a-4c73-87ad-7f24371f685d" containerName="extract-content" Dec 05 23:34:45 crc kubenswrapper[4734]: E1205 23:34:45.736213 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73bd4d64-2b2a-4c73-87ad-7f24371f685d" containerName="extract-utilities" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.736219 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="73bd4d64-2b2a-4c73-87ad-7f24371f685d" containerName="extract-utilities" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.736349 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6fa2f15-eddf-41a6-9299-8bd85f819a33" containerName="registry-server" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.736361 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="73bd4d64-2b2a-4c73-87ad-7f24371f685d" containerName="registry-server" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.737073 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-bl5vb" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.739718 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-ccq4d" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.748640 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-4clqb"] Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.750009 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-4clqb" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.760013 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-bpc4h" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.767691 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-bl5vb"] Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.796737 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-4clqb"] Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.802106 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t42c7\" (UniqueName: \"kubernetes.io/projected/6ba0bb79-4132-4bd9-a2ce-c8a9b516402d-kube-api-access-t42c7\") pod \"cinder-operator-controller-manager-6c677c69b-4clqb\" (UID: \"6ba0bb79-4132-4bd9-a2ce-c8a9b516402d\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-4clqb" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.802173 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4sr2\" (UniqueName: \"kubernetes.io/projected/4ac00d0e-d1c1-44d8-869d-1d98f5a137e0-kube-api-access-l4sr2\") pod \"barbican-operator-controller-manager-7d9dfd778-bl5vb\" (UID: \"4ac00d0e-d1c1-44d8-869d-1d98f5a137e0\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-bl5vb" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.834122 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-fwbcd"] Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.835552 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-fwbcd" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.845334 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-c98gv" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.859290 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-rp9j7"] Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.860521 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-rp9j7" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.863003 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-8fltx" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.886154 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-fwbcd"] Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.900006 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-c7c94"] Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.901494 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-c7c94" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.904126 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4sr2\" (UniqueName: \"kubernetes.io/projected/4ac00d0e-d1c1-44d8-869d-1d98f5a137e0-kube-api-access-l4sr2\") pod \"barbican-operator-controller-manager-7d9dfd778-bl5vb\" (UID: \"4ac00d0e-d1c1-44d8-869d-1d98f5a137e0\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-bl5vb" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.905110 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t42c7\" (UniqueName: \"kubernetes.io/projected/6ba0bb79-4132-4bd9-a2ce-c8a9b516402d-kube-api-access-t42c7\") pod \"cinder-operator-controller-manager-6c677c69b-4clqb\" (UID: \"6ba0bb79-4132-4bd9-a2ce-c8a9b516402d\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-4clqb" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.907894 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-c7c94"] Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.911115 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-wn8qt" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.913673 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-rp9j7"] Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.916898 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mnmlv"] Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.918171 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mnmlv" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.922659 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-r2427"] Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.923775 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-r2427" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.927925 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-d8c72" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.928144 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-4w2td" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.928181 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.932483 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mnmlv"] Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.948317 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4sr2\" (UniqueName: \"kubernetes.io/projected/4ac00d0e-d1c1-44d8-869d-1d98f5a137e0-kube-api-access-l4sr2\") pod \"barbican-operator-controller-manager-7d9dfd778-bl5vb\" (UID: \"4ac00d0e-d1c1-44d8-869d-1d98f5a137e0\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-bl5vb" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.948386 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t42c7\" (UniqueName: 
\"kubernetes.io/projected/6ba0bb79-4132-4bd9-a2ce-c8a9b516402d-kube-api-access-t42c7\") pod \"cinder-operator-controller-manager-6c677c69b-4clqb\" (UID: \"6ba0bb79-4132-4bd9-a2ce-c8a9b516402d\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-4clqb" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.953644 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-rw8vg"] Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.954883 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-967d97867-rw8vg" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.959182 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-cmshd" Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.968991 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-r2427"] Dec 05 23:34:45 crc kubenswrapper[4734]: I1205 23:34:45.981762 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-rw8vg"] Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:45.999877 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-t5gsd"] Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.001081 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-t5gsd" Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.008520 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr6fg\" (UniqueName: \"kubernetes.io/projected/4685a9c2-ef1c-462d-848c-fbbea6a8ebfe-kube-api-access-rr6fg\") pod \"heat-operator-controller-manager-5f64f6f8bb-c7c94\" (UID: \"4685a9c2-ef1c-462d-848c-fbbea6a8ebfe\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-c7c94" Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.008597 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq99m\" (UniqueName: \"kubernetes.io/projected/c12a23f4-fdd7-455e-b74c-f757f15990ca-kube-api-access-jq99m\") pod \"glance-operator-controller-manager-5697bb5779-rp9j7\" (UID: \"c12a23f4-fdd7-455e-b74c-f757f15990ca\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-rp9j7" Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.008680 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsjxz\" (UniqueName: \"kubernetes.io/projected/157817be-876f-4157-87af-6ef317b91cb9-kube-api-access-jsjxz\") pod \"designate-operator-controller-manager-697fb699cf-fwbcd\" (UID: \"157817be-876f-4157-87af-6ef317b91cb9\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-fwbcd" Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.010289 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-zfw2l" Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.033633 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-t5gsd"] Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 
23:34:46.075717 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-lwpjm"] Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.091069 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-bl5vb" Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.100114 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-lwpjm" Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.108990 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-lhcxh" Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.112254 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-nc6wd"] Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.114094 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-nc6wd" Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.140706 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-4clqb" Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.145812 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-2qq45" Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.147994 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kzf8\" (UniqueName: \"kubernetes.io/projected/df5aaec7-4487-47a1-98c4-0206d0ecf7f4-kube-api-access-4kzf8\") pod \"infra-operator-controller-manager-78d48bff9d-r2427\" (UID: \"df5aaec7-4487-47a1-98c4-0206d0ecf7f4\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-r2427" Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.148099 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zqbx\" (UniqueName: \"kubernetes.io/projected/9883e2bb-76f7-476d-8a74-e358ebf37ed2-kube-api-access-9zqbx\") pod \"manila-operator-controller-manager-7c79b5df47-lwpjm\" (UID: \"9883e2bb-76f7-476d-8a74-e358ebf37ed2\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-lwpjm" Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.148189 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtv7s\" (UniqueName: \"kubernetes.io/projected/ef794353-3292-4809-94d8-105aaa36889e-kube-api-access-wtv7s\") pod \"ironic-operator-controller-manager-967d97867-rw8vg\" (UID: \"ef794353-3292-4809-94d8-105aaa36889e\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-rw8vg" Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.148246 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsjxz\" (UniqueName: 
\"kubernetes.io/projected/157817be-876f-4157-87af-6ef317b91cb9-kube-api-access-jsjxz\") pod \"designate-operator-controller-manager-697fb699cf-fwbcd\" (UID: \"157817be-876f-4157-87af-6ef317b91cb9\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-fwbcd" Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.148311 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgr4f\" (UniqueName: \"kubernetes.io/projected/3255ef71-c5a8-4fef-a1ab-dc2107c710eb-kube-api-access-sgr4f\") pod \"keystone-operator-controller-manager-7765d96ddf-t5gsd\" (UID: \"3255ef71-c5a8-4fef-a1ab-dc2107c710eb\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-t5gsd" Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.148353 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr6fg\" (UniqueName: \"kubernetes.io/projected/4685a9c2-ef1c-462d-848c-fbbea6a8ebfe-kube-api-access-rr6fg\") pod \"heat-operator-controller-manager-5f64f6f8bb-c7c94\" (UID: \"4685a9c2-ef1c-462d-848c-fbbea6a8ebfe\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-c7c94" Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.148379 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df5aaec7-4487-47a1-98c4-0206d0ecf7f4-cert\") pod \"infra-operator-controller-manager-78d48bff9d-r2427\" (UID: \"df5aaec7-4487-47a1-98c4-0206d0ecf7f4\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-r2427" Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.148402 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjbs6\" (UniqueName: \"kubernetes.io/projected/9a792918-0311-4b1b-8920-a315370ecba7-kube-api-access-jjbs6\") pod 
\"horizon-operator-controller-manager-68c6d99b8f-mnmlv\" (UID: \"9a792918-0311-4b1b-8920-a315370ecba7\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mnmlv"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.148473 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq99m\" (UniqueName: \"kubernetes.io/projected/c12a23f4-fdd7-455e-b74c-f757f15990ca-kube-api-access-jq99m\") pod \"glance-operator-controller-manager-5697bb5779-rp9j7\" (UID: \"c12a23f4-fdd7-455e-b74c-f757f15990ca\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-rp9j7"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.152517 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-lwpjm"]
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.185617 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zx66z"]
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.192038 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsjxz\" (UniqueName: \"kubernetes.io/projected/157817be-876f-4157-87af-6ef317b91cb9-kube-api-access-jsjxz\") pod \"designate-operator-controller-manager-697fb699cf-fwbcd\" (UID: \"157817be-876f-4157-87af-6ef317b91cb9\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-fwbcd"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.193227 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr6fg\" (UniqueName: \"kubernetes.io/projected/4685a9c2-ef1c-462d-848c-fbbea6a8ebfe-kube-api-access-rr6fg\") pod \"heat-operator-controller-manager-5f64f6f8bb-c7c94\" (UID: \"4685a9c2-ef1c-462d-848c-fbbea6a8ebfe\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-c7c94"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.201086 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq99m\" (UniqueName: \"kubernetes.io/projected/c12a23f4-fdd7-455e-b74c-f757f15990ca-kube-api-access-jq99m\") pod \"glance-operator-controller-manager-5697bb5779-rp9j7\" (UID: \"c12a23f4-fdd7-455e-b74c-f757f15990ca\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-rp9j7"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.204127 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-rp9j7"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.204565 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zx66z"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.210872 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-nc6wd"]
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.225150 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zx66z"]
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.231636 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-jwjgn"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.237117 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-c7c94"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.238362 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-wf4vr"]
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.239951 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-wf4vr"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.245006 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-7dvwz"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.249603 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtv7s\" (UniqueName: \"kubernetes.io/projected/ef794353-3292-4809-94d8-105aaa36889e-kube-api-access-wtv7s\") pod \"ironic-operator-controller-manager-967d97867-rw8vg\" (UID: \"ef794353-3292-4809-94d8-105aaa36889e\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-rw8vg"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.249767 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgr4f\" (UniqueName: \"kubernetes.io/projected/3255ef71-c5a8-4fef-a1ab-dc2107c710eb-kube-api-access-sgr4f\") pod \"keystone-operator-controller-manager-7765d96ddf-t5gsd\" (UID: \"3255ef71-c5a8-4fef-a1ab-dc2107c710eb\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-t5gsd"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.250299 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df5aaec7-4487-47a1-98c4-0206d0ecf7f4-cert\") pod \"infra-operator-controller-manager-78d48bff9d-r2427\" (UID: \"df5aaec7-4487-47a1-98c4-0206d0ecf7f4\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-r2427"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.250401 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjbs6\" (UniqueName: \"kubernetes.io/projected/9a792918-0311-4b1b-8920-a315370ecba7-kube-api-access-jjbs6\") pod \"horizon-operator-controller-manager-68c6d99b8f-mnmlv\" (UID: \"9a792918-0311-4b1b-8920-a315370ecba7\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mnmlv"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.250515 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kzf8\" (UniqueName: \"kubernetes.io/projected/df5aaec7-4487-47a1-98c4-0206d0ecf7f4-kube-api-access-4kzf8\") pod \"infra-operator-controller-manager-78d48bff9d-r2427\" (UID: \"df5aaec7-4487-47a1-98c4-0206d0ecf7f4\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-r2427"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.250649 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zqbx\" (UniqueName: \"kubernetes.io/projected/9883e2bb-76f7-476d-8a74-e358ebf37ed2-kube-api-access-9zqbx\") pod \"manila-operator-controller-manager-7c79b5df47-lwpjm\" (UID: \"9883e2bb-76f7-476d-8a74-e358ebf37ed2\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-lwpjm"
Dec 05 23:34:46 crc kubenswrapper[4734]: E1205 23:34:46.251269 4734 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Dec 05 23:34:46 crc kubenswrapper[4734]: E1205 23:34:46.251442 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df5aaec7-4487-47a1-98c4-0206d0ecf7f4-cert podName:df5aaec7-4487-47a1-98c4-0206d0ecf7f4 nodeName:}" failed. No retries permitted until 2025-12-05 23:34:46.751417541 +0000 UTC m=+907.434821817 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/df5aaec7-4487-47a1-98c4-0206d0ecf7f4-cert") pod "infra-operator-controller-manager-78d48bff9d-r2427" (UID: "df5aaec7-4487-47a1-98c4-0206d0ecf7f4") : secret "infra-operator-webhook-server-cert" not found
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.267717 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-bf28l"]
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.273681 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjbs6\" (UniqueName: \"kubernetes.io/projected/9a792918-0311-4b1b-8920-a315370ecba7-kube-api-access-jjbs6\") pod \"horizon-operator-controller-manager-68c6d99b8f-mnmlv\" (UID: \"9a792918-0311-4b1b-8920-a315370ecba7\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mnmlv"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.273786 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgr4f\" (UniqueName: \"kubernetes.io/projected/3255ef71-c5a8-4fef-a1ab-dc2107c710eb-kube-api-access-sgr4f\") pod \"keystone-operator-controller-manager-7765d96ddf-t5gsd\" (UID: \"3255ef71-c5a8-4fef-a1ab-dc2107c710eb\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-t5gsd"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.274123 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kzf8\" (UniqueName: \"kubernetes.io/projected/df5aaec7-4487-47a1-98c4-0206d0ecf7f4-kube-api-access-4kzf8\") pod \"infra-operator-controller-manager-78d48bff9d-r2427\" (UID: \"df5aaec7-4487-47a1-98c4-0206d0ecf7f4\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-r2427"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.275996 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-bf28l"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.284171 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zqbx\" (UniqueName: \"kubernetes.io/projected/9883e2bb-76f7-476d-8a74-e358ebf37ed2-kube-api-access-9zqbx\") pod \"manila-operator-controller-manager-7c79b5df47-lwpjm\" (UID: \"9883e2bb-76f7-476d-8a74-e358ebf37ed2\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-lwpjm"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.287232 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-n67wp"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.299866 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mnmlv"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.304679 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-wf4vr"]
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.324560 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtv7s\" (UniqueName: \"kubernetes.io/projected/ef794353-3292-4809-94d8-105aaa36889e-kube-api-access-wtv7s\") pod \"ironic-operator-controller-manager-967d97867-rw8vg\" (UID: \"ef794353-3292-4809-94d8-105aaa36889e\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-rw8vg"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.331500 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-bf28l"]
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.337667 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fhnqmj"]
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.339046 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fhnqmj"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.344500 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.345087 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-z42zm"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.349742 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-hdt6h"]
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.354483 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg7r2\" (UniqueName: \"kubernetes.io/projected/58aa2c14-9374-45b1-b6dd-07e849f23306-kube-api-access-bg7r2\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-zx66z\" (UID: \"58aa2c14-9374-45b1-b6dd-07e849f23306\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zx66z"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.354555 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twjbx\" (UniqueName: \"kubernetes.io/projected/aa5ccaa9-5087-4891-b255-a5135271a2a5-kube-api-access-twjbx\") pod \"nova-operator-controller-manager-697bc559fc-wf4vr\" (UID: \"aa5ccaa9-5087-4891-b255-a5135271a2a5\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-wf4vr"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.354583 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsk7f\" (UniqueName: \"kubernetes.io/projected/608bca6a-1cb5-44b9-91c6-32a77372a4e5-kube-api-access-lsk7f\") pod \"mariadb-operator-controller-manager-79c8c4686c-nc6wd\" (UID: \"608bca6a-1cb5-44b9-91c6-32a77372a4e5\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-nc6wd"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.355054 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hdt6h"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.357139 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fhnqmj"]
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.359380 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-pt6wd"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.369735 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-967d97867-rw8vg"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.372603 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-hdt6h"]
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.403287 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-w8l4b"]
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.404680 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-w8l4b"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.406992 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-wjc86"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.409764 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-w8l4b"]
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.410022 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-t5gsd"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.432572 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-cdtjx"]
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.443964 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-cdtjx"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.448882 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-sdpwm"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.449066 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-cdtjx"]
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.459335 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9cce8abe-4425-4cea-ac4f-3fd707bd5737-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fhnqmj\" (UID: \"9cce8abe-4425-4cea-ac4f-3fd707bd5737\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fhnqmj"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.459390 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5tdn\" (UniqueName: \"kubernetes.io/projected/3ab5c543-f1e6-455c-a051-7940ffcc833d-kube-api-access-v5tdn\") pod \"octavia-operator-controller-manager-998648c74-bf28l\" (UID: \"3ab5c543-f1e6-455c-a051-7940ffcc833d\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-bf28l"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.459602 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg7r2\" (UniqueName: \"kubernetes.io/projected/58aa2c14-9374-45b1-b6dd-07e849f23306-kube-api-access-bg7r2\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-zx66z\" (UID: \"58aa2c14-9374-45b1-b6dd-07e849f23306\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zx66z"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.459666 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twjbx\" (UniqueName: \"kubernetes.io/projected/aa5ccaa9-5087-4891-b255-a5135271a2a5-kube-api-access-twjbx\") pod \"nova-operator-controller-manager-697bc559fc-wf4vr\" (UID: \"aa5ccaa9-5087-4891-b255-a5135271a2a5\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-wf4vr"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.459691 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsk7f\" (UniqueName: \"kubernetes.io/projected/608bca6a-1cb5-44b9-91c6-32a77372a4e5-kube-api-access-lsk7f\") pod \"mariadb-operator-controller-manager-79c8c4686c-nc6wd\" (UID: \"608bca6a-1cb5-44b9-91c6-32a77372a4e5\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-nc6wd"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.459734 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbkcf\" (UniqueName: \"kubernetes.io/projected/ea29b614-e490-4a3e-925e-d9f6c56b0c35-kube-api-access-rbkcf\") pod \"ovn-operator-controller-manager-b6456fdb6-hdt6h\" (UID: \"ea29b614-e490-4a3e-925e-d9f6c56b0c35\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hdt6h"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.459884 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p7c6\" (UniqueName: \"kubernetes.io/projected/9cce8abe-4425-4cea-ac4f-3fd707bd5737-kube-api-access-2p7c6\") pod \"openstack-baremetal-operator-controller-manager-84b575879fhnqmj\" (UID: \"9cce8abe-4425-4cea-ac4f-3fd707bd5737\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fhnqmj"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.477656 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2fm4z"]
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.478509 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-fwbcd"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.479112 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2fm4z"]
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.479242 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2fm4z"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.491747 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-pnpbm"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.494674 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-j4m2j"]
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.496156 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-j4m2j"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.498156 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-rv5cv"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.503567 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsk7f\" (UniqueName: \"kubernetes.io/projected/608bca6a-1cb5-44b9-91c6-32a77372a4e5-kube-api-access-lsk7f\") pod \"mariadb-operator-controller-manager-79c8c4686c-nc6wd\" (UID: \"608bca6a-1cb5-44b9-91c6-32a77372a4e5\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-nc6wd"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.513826 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twjbx\" (UniqueName: \"kubernetes.io/projected/aa5ccaa9-5087-4891-b255-a5135271a2a5-kube-api-access-twjbx\") pod \"nova-operator-controller-manager-697bc559fc-wf4vr\" (UID: \"aa5ccaa9-5087-4891-b255-a5135271a2a5\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-wf4vr"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.513926 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-j4m2j"]
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.518231 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg7r2\" (UniqueName: \"kubernetes.io/projected/58aa2c14-9374-45b1-b6dd-07e849f23306-kube-api-access-bg7r2\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-zx66z\" (UID: \"58aa2c14-9374-45b1-b6dd-07e849f23306\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zx66z"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.540962 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-lwpjm"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.549658 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-nc6wd"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.556362 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zx66z"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.559302 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-667bd8d554-xwqfj"]
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.562696 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbkcf\" (UniqueName: \"kubernetes.io/projected/ea29b614-e490-4a3e-925e-d9f6c56b0c35-kube-api-access-rbkcf\") pod \"ovn-operator-controller-manager-b6456fdb6-hdt6h\" (UID: \"ea29b614-e490-4a3e-925e-d9f6c56b0c35\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hdt6h"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.562763 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvnzt\" (UniqueName: \"kubernetes.io/projected/ad6bda6e-964f-44c3-b759-ad151097b4f1-kube-api-access-vvnzt\") pod \"placement-operator-controller-manager-78f8948974-w8l4b\" (UID: \"ad6bda6e-964f-44c3-b759-ad151097b4f1\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-w8l4b"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.562792 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p7c6\" (UniqueName: \"kubernetes.io/projected/9cce8abe-4425-4cea-ac4f-3fd707bd5737-kube-api-access-2p7c6\") pod \"openstack-baremetal-operator-controller-manager-84b575879fhnqmj\" (UID: \"9cce8abe-4425-4cea-ac4f-3fd707bd5737\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fhnqmj"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.562811 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4h8q\" (UniqueName: \"kubernetes.io/projected/2050fd66-c55a-4048-a869-cb786b5f0d2b-kube-api-access-w4h8q\") pod \"swift-operator-controller-manager-9d58d64bc-cdtjx\" (UID: \"2050fd66-c55a-4048-a869-cb786b5f0d2b\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-cdtjx"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.562827 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjqh5\" (UniqueName: \"kubernetes.io/projected/696f07ba-7c46-41f2-826f-890756824285-kube-api-access-qjqh5\") pod \"telemetry-operator-controller-manager-58d5ff84df-2fm4z\" (UID: \"696f07ba-7c46-41f2-826f-890756824285\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2fm4z"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.562844 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9cce8abe-4425-4cea-ac4f-3fd707bd5737-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fhnqmj\" (UID: \"9cce8abe-4425-4cea-ac4f-3fd707bd5737\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fhnqmj"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.562863 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5tdn\" (UniqueName: \"kubernetes.io/projected/3ab5c543-f1e6-455c-a051-7940ffcc833d-kube-api-access-v5tdn\") pod \"octavia-operator-controller-manager-998648c74-bf28l\" (UID: \"3ab5c543-f1e6-455c-a051-7940ffcc833d\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-bf28l"
Dec 05 23:34:46 crc kubenswrapper[4734]: E1205 23:34:46.563760 4734 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 05 23:34:46 crc kubenswrapper[4734]: E1205 23:34:46.563814 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9cce8abe-4425-4cea-ac4f-3fd707bd5737-cert podName:9cce8abe-4425-4cea-ac4f-3fd707bd5737 nodeName:}" failed. No retries permitted until 2025-12-05 23:34:47.063793981 +0000 UTC m=+907.747198257 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9cce8abe-4425-4cea-ac4f-3fd707bd5737-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fhnqmj" (UID: "9cce8abe-4425-4cea-ac4f-3fd707bd5737") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.566933 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-xwqfj"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.590351 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-mjdr7"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.597943 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-667bd8d554-xwqfj"]
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.606930 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-wf4vr"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.607414 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5tdn\" (UniqueName: \"kubernetes.io/projected/3ab5c543-f1e6-455c-a051-7940ffcc833d-kube-api-access-v5tdn\") pod \"octavia-operator-controller-manager-998648c74-bf28l\" (UID: \"3ab5c543-f1e6-455c-a051-7940ffcc833d\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-bf28l"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.642667 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p7c6\" (UniqueName: \"kubernetes.io/projected/9cce8abe-4425-4cea-ac4f-3fd707bd5737-kube-api-access-2p7c6\") pod \"openstack-baremetal-operator-controller-manager-84b575879fhnqmj\" (UID: \"9cce8abe-4425-4cea-ac4f-3fd707bd5737\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fhnqmj"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.642769 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbkcf\" (UniqueName: \"kubernetes.io/projected/ea29b614-e490-4a3e-925e-d9f6c56b0c35-kube-api-access-rbkcf\") pod \"ovn-operator-controller-manager-b6456fdb6-hdt6h\" (UID: \"ea29b614-e490-4a3e-925e-d9f6c56b0c35\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hdt6h"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.669268 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvnzt\" (UniqueName: \"kubernetes.io/projected/ad6bda6e-964f-44c3-b759-ad151097b4f1-kube-api-access-vvnzt\") pod \"placement-operator-controller-manager-78f8948974-w8l4b\" (UID: \"ad6bda6e-964f-44c3-b759-ad151097b4f1\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-w8l4b"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.669330 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q67qq\" (UniqueName: \"kubernetes.io/projected/b7ee6df9-99e2-480d-aa84-7618ff0cda2f-kube-api-access-q67qq\") pod \"watcher-operator-controller-manager-667bd8d554-xwqfj\" (UID: \"b7ee6df9-99e2-480d-aa84-7618ff0cda2f\") " pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-xwqfj"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.669374 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4h8q\" (UniqueName: \"kubernetes.io/projected/2050fd66-c55a-4048-a869-cb786b5f0d2b-kube-api-access-w4h8q\") pod \"swift-operator-controller-manager-9d58d64bc-cdtjx\" (UID: \"2050fd66-c55a-4048-a869-cb786b5f0d2b\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-cdtjx"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.669409 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjqh5\" (UniqueName: \"kubernetes.io/projected/696f07ba-7c46-41f2-826f-890756824285-kube-api-access-qjqh5\") pod \"telemetry-operator-controller-manager-58d5ff84df-2fm4z\" (UID: \"696f07ba-7c46-41f2-826f-890756824285\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2fm4z"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.669445 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkkdn\" (UniqueName: \"kubernetes.io/projected/974bff7e-6bfc-49c2-9d3d-831d1bf5385d-kube-api-access-jkkdn\") pod \"test-operator-controller-manager-5854674fcc-j4m2j\" (UID: \"974bff7e-6bfc-49c2-9d3d-831d1bf5385d\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-j4m2j"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.693327 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-bf28l"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.729020 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjqh5\" (UniqueName: \"kubernetes.io/projected/696f07ba-7c46-41f2-826f-890756824285-kube-api-access-qjqh5\") pod \"telemetry-operator-controller-manager-58d5ff84df-2fm4z\" (UID: \"696f07ba-7c46-41f2-826f-890756824285\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2fm4z"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.730374 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4h8q\" (UniqueName: \"kubernetes.io/projected/2050fd66-c55a-4048-a869-cb786b5f0d2b-kube-api-access-w4h8q\") pod \"swift-operator-controller-manager-9d58d64bc-cdtjx\" (UID: \"2050fd66-c55a-4048-a869-cb786b5f0d2b\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-cdtjx"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.749744 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hdt6h"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.749745 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5845f76896-vhzwq"]
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.751467 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvnzt\" (UniqueName: \"kubernetes.io/projected/ad6bda6e-964f-44c3-b759-ad151097b4f1-kube-api-access-vvnzt\") pod \"placement-operator-controller-manager-78f8948974-w8l4b\" (UID: \"ad6bda6e-964f-44c3-b759-ad151097b4f1\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-w8l4b"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.753551 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5845f76896-vhzwq"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.765323 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-w8l4b"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.769627 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5845f76896-vhzwq"]
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.775622 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df5aaec7-4487-47a1-98c4-0206d0ecf7f4-cert\") pod \"infra-operator-controller-manager-78d48bff9d-r2427\" (UID: \"df5aaec7-4487-47a1-98c4-0206d0ecf7f4\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-r2427"
Dec 05 23:34:46 crc kubenswrapper[4734]: E1205 23:34:46.780505 4734 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Dec 05 23:34:46 crc kubenswrapper[4734]: E1205 23:34:46.780614 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df5aaec7-4487-47a1-98c4-0206d0ecf7f4-cert podName:df5aaec7-4487-47a1-98c4-0206d0ecf7f4 nodeName:}" failed. No retries permitted until 2025-12-05 23:34:47.780589719 +0000 UTC m=+908.463993995 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/df5aaec7-4487-47a1-98c4-0206d0ecf7f4-cert") pod "infra-operator-controller-manager-78d48bff9d-r2427" (UID: "df5aaec7-4487-47a1-98c4-0206d0ecf7f4") : secret "infra-operator-webhook-server-cert" not found
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.782090 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.782504 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-rpr7m"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.782744 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.806754 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q67qq\" (UniqueName: \"kubernetes.io/projected/b7ee6df9-99e2-480d-aa84-7618ff0cda2f-kube-api-access-q67qq\") pod \"watcher-operator-controller-manager-667bd8d554-xwqfj\" (UID: \"b7ee6df9-99e2-480d-aa84-7618ff0cda2f\") " pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-xwqfj"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.806848 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkkdn\" (UniqueName: \"kubernetes.io/projected/974bff7e-6bfc-49c2-9d3d-831d1bf5385d-kube-api-access-jkkdn\") pod \"test-operator-controller-manager-5854674fcc-j4m2j\" (UID: \"974bff7e-6bfc-49c2-9d3d-831d1bf5385d\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-j4m2j"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.807658 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-cdtjx"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.836031 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2fm4z"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.836268 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q67qq\" (UniqueName: \"kubernetes.io/projected/b7ee6df9-99e2-480d-aa84-7618ff0cda2f-kube-api-access-q67qq\") pod \"watcher-operator-controller-manager-667bd8d554-xwqfj\" (UID: \"b7ee6df9-99e2-480d-aa84-7618ff0cda2f\") " pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-xwqfj"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.857951 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkkdn\" (UniqueName: \"kubernetes.io/projected/974bff7e-6bfc-49c2-9d3d-831d1bf5385d-kube-api-access-jkkdn\") pod \"test-operator-controller-manager-5854674fcc-j4m2j\" (UID: \"974bff7e-6bfc-49c2-9d3d-831d1bf5385d\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-j4m2j"
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.888139 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4bt9z"]
Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.889556 4734 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4bt9z" Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.896606 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4bt9z"] Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.903603 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-hzpk8" Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.908588 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-webhook-certs\") pod \"openstack-operator-controller-manager-5845f76896-vhzwq\" (UID: \"2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e\") " pod="openstack-operators/openstack-operator-controller-manager-5845f76896-vhzwq" Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.908649 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd8xt\" (UniqueName: \"kubernetes.io/projected/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-kube-api-access-kd8xt\") pod \"openstack-operator-controller-manager-5845f76896-vhzwq\" (UID: \"2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e\") " pod="openstack-operators/openstack-operator-controller-manager-5845f76896-vhzwq" Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.908739 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-metrics-certs\") pod \"openstack-operator-controller-manager-5845f76896-vhzwq\" (UID: \"2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e\") " pod="openstack-operators/openstack-operator-controller-manager-5845f76896-vhzwq" Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.944190 4734 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-bl5vb"] Dec 05 23:34:46 crc kubenswrapper[4734]: I1205 23:34:46.972711 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-xwqfj" Dec 05 23:34:47 crc kubenswrapper[4734]: I1205 23:34:47.010703 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-metrics-certs\") pod \"openstack-operator-controller-manager-5845f76896-vhzwq\" (UID: \"2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e\") " pod="openstack-operators/openstack-operator-controller-manager-5845f76896-vhzwq" Dec 05 23:34:47 crc kubenswrapper[4734]: I1205 23:34:47.010930 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-webhook-certs\") pod \"openstack-operator-controller-manager-5845f76896-vhzwq\" (UID: \"2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e\") " pod="openstack-operators/openstack-operator-controller-manager-5845f76896-vhzwq" Dec 05 23:34:47 crc kubenswrapper[4734]: I1205 23:34:47.010963 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd8xt\" (UniqueName: \"kubernetes.io/projected/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-kube-api-access-kd8xt\") pod \"openstack-operator-controller-manager-5845f76896-vhzwq\" (UID: \"2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e\") " pod="openstack-operators/openstack-operator-controller-manager-5845f76896-vhzwq" Dec 05 23:34:47 crc kubenswrapper[4734]: I1205 23:34:47.011022 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2r6d\" (UniqueName: \"kubernetes.io/projected/43c8ec4c-96f9-47f0-9313-2813ea1c62c2-kube-api-access-m2r6d\") pod 
\"rabbitmq-cluster-operator-manager-668c99d594-4bt9z\" (UID: \"43c8ec4c-96f9-47f0-9313-2813ea1c62c2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4bt9z" Dec 05 23:34:47 crc kubenswrapper[4734]: E1205 23:34:47.011284 4734 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 23:34:47 crc kubenswrapper[4734]: E1205 23:34:47.011359 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-metrics-certs podName:2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e nodeName:}" failed. No retries permitted until 2025-12-05 23:34:47.511318213 +0000 UTC m=+908.194722489 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-metrics-certs") pod "openstack-operator-controller-manager-5845f76896-vhzwq" (UID: "2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e") : secret "metrics-server-cert" not found Dec 05 23:34:47 crc kubenswrapper[4734]: E1205 23:34:47.011815 4734 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 23:34:47 crc kubenswrapper[4734]: E1205 23:34:47.011843 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-webhook-certs podName:2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e nodeName:}" failed. No retries permitted until 2025-12-05 23:34:47.511835735 +0000 UTC m=+908.195240011 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-webhook-certs") pod "openstack-operator-controller-manager-5845f76896-vhzwq" (UID: "2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e") : secret "webhook-server-cert" not found Dec 05 23:34:47 crc kubenswrapper[4734]: I1205 23:34:47.057718 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd8xt\" (UniqueName: \"kubernetes.io/projected/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-kube-api-access-kd8xt\") pod \"openstack-operator-controller-manager-5845f76896-vhzwq\" (UID: \"2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e\") " pod="openstack-operators/openstack-operator-controller-manager-5845f76896-vhzwq" Dec 05 23:34:47 crc kubenswrapper[4734]: I1205 23:34:47.114973 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2r6d\" (UniqueName: \"kubernetes.io/projected/43c8ec4c-96f9-47f0-9313-2813ea1c62c2-kube-api-access-m2r6d\") pod \"rabbitmq-cluster-operator-manager-668c99d594-4bt9z\" (UID: \"43c8ec4c-96f9-47f0-9313-2813ea1c62c2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4bt9z" Dec 05 23:34:47 crc kubenswrapper[4734]: I1205 23:34:47.115393 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9cce8abe-4425-4cea-ac4f-3fd707bd5737-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fhnqmj\" (UID: \"9cce8abe-4425-4cea-ac4f-3fd707bd5737\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fhnqmj" Dec 05 23:34:47 crc kubenswrapper[4734]: E1205 23:34:47.115543 4734 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 23:34:47 crc kubenswrapper[4734]: E1205 23:34:47.115607 4734 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/9cce8abe-4425-4cea-ac4f-3fd707bd5737-cert podName:9cce8abe-4425-4cea-ac4f-3fd707bd5737 nodeName:}" failed. No retries permitted until 2025-12-05 23:34:48.115588967 +0000 UTC m=+908.798993253 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9cce8abe-4425-4cea-ac4f-3fd707bd5737-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fhnqmj" (UID: "9cce8abe-4425-4cea-ac4f-3fd707bd5737") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 23:34:47 crc kubenswrapper[4734]: I1205 23:34:47.153586 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2r6d\" (UniqueName: \"kubernetes.io/projected/43c8ec4c-96f9-47f0-9313-2813ea1c62c2-kube-api-access-m2r6d\") pod \"rabbitmq-cluster-operator-manager-668c99d594-4bt9z\" (UID: \"43c8ec4c-96f9-47f0-9313-2813ea1c62c2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4bt9z" Dec 05 23:34:47 crc kubenswrapper[4734]: I1205 23:34:47.156681 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-j4m2j" Dec 05 23:34:47 crc kubenswrapper[4734]: I1205 23:34:47.170205 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4bt9z" Dec 05 23:34:47 crc kubenswrapper[4734]: I1205 23:34:47.376100 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mnmlv"] Dec 05 23:34:47 crc kubenswrapper[4734]: I1205 23:34:47.419811 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-4clqb"] Dec 05 23:34:47 crc kubenswrapper[4734]: I1205 23:34:47.521329 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-webhook-certs\") pod \"openstack-operator-controller-manager-5845f76896-vhzwq\" (UID: \"2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e\") " pod="openstack-operators/openstack-operator-controller-manager-5845f76896-vhzwq" Dec 05 23:34:47 crc kubenswrapper[4734]: E1205 23:34:47.521591 4734 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 23:34:47 crc kubenswrapper[4734]: I1205 23:34:47.522101 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-metrics-certs\") pod \"openstack-operator-controller-manager-5845f76896-vhzwq\" (UID: \"2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e\") " pod="openstack-operators/openstack-operator-controller-manager-5845f76896-vhzwq" Dec 05 23:34:47 crc kubenswrapper[4734]: E1205 23:34:47.522204 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-webhook-certs podName:2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e nodeName:}" failed. No retries permitted until 2025-12-05 23:34:48.522177418 +0000 UTC m=+909.205581694 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-webhook-certs") pod "openstack-operator-controller-manager-5845f76896-vhzwq" (UID: "2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e") : secret "webhook-server-cert" not found Dec 05 23:34:47 crc kubenswrapper[4734]: E1205 23:34:47.522326 4734 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 23:34:47 crc kubenswrapper[4734]: E1205 23:34:47.522375 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-metrics-certs podName:2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e nodeName:}" failed. No retries permitted until 2025-12-05 23:34:48.522362992 +0000 UTC m=+909.205767278 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-metrics-certs") pod "openstack-operator-controller-manager-5845f76896-vhzwq" (UID: "2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e") : secret "metrics-server-cert" not found Dec 05 23:34:47 crc kubenswrapper[4734]: I1205 23:34:47.587170 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-4clqb" event={"ID":"6ba0bb79-4132-4bd9-a2ce-c8a9b516402d","Type":"ContainerStarted","Data":"3701ebfcb8ebaa71698f4ba3fdc1a6def6795f5fea8a2ab39692b7d57e677da1"} Dec 05 23:34:47 crc kubenswrapper[4734]: I1205 23:34:47.590314 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-bl5vb" event={"ID":"4ac00d0e-d1c1-44d8-869d-1d98f5a137e0","Type":"ContainerStarted","Data":"e26579dd8c79d4bd57e90932c77e24b2713f679e343dcc844832fb7b081d5266"} Dec 05 23:34:47 crc kubenswrapper[4734]: I1205 23:34:47.596996 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mnmlv" event={"ID":"9a792918-0311-4b1b-8920-a315370ecba7","Type":"ContainerStarted","Data":"b112b4f1deb744b12aaf67a2241ec77579d5a7a01438f5e539879fcfff4cb1d9"} Dec 05 23:34:47 crc kubenswrapper[4734]: I1205 23:34:47.631469 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-rw8vg"] Dec 05 23:34:47 crc kubenswrapper[4734]: I1205 23:34:47.641368 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-c7c94"] Dec 05 23:34:47 crc kubenswrapper[4734]: W1205 23:34:47.641838 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef794353_3292_4809_94d8_105aaa36889e.slice/crio-4d74926a23e4b2fbac99afdd0dbc6081015ae7927afc6567df3a2249dc8c18c6 WatchSource:0}: Error finding container 4d74926a23e4b2fbac99afdd0dbc6081015ae7927afc6567df3a2249dc8c18c6: Status 404 returned error can't find the container with id 4d74926a23e4b2fbac99afdd0dbc6081015ae7927afc6567df3a2249dc8c18c6 Dec 05 23:34:47 crc kubenswrapper[4734]: W1205 23:34:47.645187 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4685a9c2_ef1c_462d_848c_fbbea6a8ebfe.slice/crio-db00a9517993e4d98387ad874b62fe9bf782563a6610deaa28652fd26ef387c6 WatchSource:0}: Error finding container db00a9517993e4d98387ad874b62fe9bf782563a6610deaa28652fd26ef387c6: Status 404 returned error can't find the container with id db00a9517993e4d98387ad874b62fe9bf782563a6610deaa28652fd26ef387c6 Dec 05 23:34:47 crc kubenswrapper[4734]: I1205 23:34:47.658593 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-t5gsd"] Dec 05 23:34:47 crc kubenswrapper[4734]: I1205 23:34:47.815693 4734 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-fwbcd"] Dec 05 23:34:47 crc kubenswrapper[4734]: W1205 23:34:47.825065 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58aa2c14_9374_45b1_b6dd_07e849f23306.slice/crio-fa382185172f23d802f6b176212f721f7206a916ee09fe9166609cafc06bdbe8 WatchSource:0}: Error finding container fa382185172f23d802f6b176212f721f7206a916ee09fe9166609cafc06bdbe8: Status 404 returned error can't find the container with id fa382185172f23d802f6b176212f721f7206a916ee09fe9166609cafc06bdbe8 Dec 05 23:34:47 crc kubenswrapper[4734]: I1205 23:34:47.825685 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zx66z"] Dec 05 23:34:47 crc kubenswrapper[4734]: I1205 23:34:47.827833 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df5aaec7-4487-47a1-98c4-0206d0ecf7f4-cert\") pod \"infra-operator-controller-manager-78d48bff9d-r2427\" (UID: \"df5aaec7-4487-47a1-98c4-0206d0ecf7f4\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-r2427" Dec 05 23:34:47 crc kubenswrapper[4734]: E1205 23:34:47.827981 4734 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 23:34:47 crc kubenswrapper[4734]: E1205 23:34:47.828026 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df5aaec7-4487-47a1-98c4-0206d0ecf7f4-cert podName:df5aaec7-4487-47a1-98c4-0206d0ecf7f4 nodeName:}" failed. No retries permitted until 2025-12-05 23:34:49.828011932 +0000 UTC m=+910.511416208 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/df5aaec7-4487-47a1-98c4-0206d0ecf7f4-cert") pod "infra-operator-controller-manager-78d48bff9d-r2427" (UID: "df5aaec7-4487-47a1-98c4-0206d0ecf7f4") : secret "infra-operator-webhook-server-cert" not found Dec 05 23:34:47 crc kubenswrapper[4734]: I1205 23:34:47.836559 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-lwpjm"] Dec 05 23:34:47 crc kubenswrapper[4734]: W1205 23:34:47.843392 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc12a23f4_fdd7_455e_b74c_f757f15990ca.slice/crio-b2271d0cd034a831cf94c043b6d5e99962af796f3f9c9bd1ca69b5f0b487dcfb WatchSource:0}: Error finding container b2271d0cd034a831cf94c043b6d5e99962af796f3f9c9bd1ca69b5f0b487dcfb: Status 404 returned error can't find the container with id b2271d0cd034a831cf94c043b6d5e99962af796f3f9c9bd1ca69b5f0b487dcfb Dec 05 23:34:47 crc kubenswrapper[4734]: I1205 23:34:47.844651 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-cdtjx"] Dec 05 23:34:47 crc kubenswrapper[4734]: I1205 23:34:47.853096 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-rp9j7"] Dec 05 23:34:47 crc kubenswrapper[4734]: I1205 23:34:47.860971 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-nc6wd"] Dec 05 23:34:48 crc kubenswrapper[4734]: I1205 23:34:48.132755 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9cce8abe-4425-4cea-ac4f-3fd707bd5737-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fhnqmj\" (UID: \"9cce8abe-4425-4cea-ac4f-3fd707bd5737\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fhnqmj" Dec 05 23:34:48 crc kubenswrapper[4734]: E1205 23:34:48.132973 4734 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 23:34:48 crc kubenswrapper[4734]: E1205 23:34:48.133026 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9cce8abe-4425-4cea-ac4f-3fd707bd5737-cert podName:9cce8abe-4425-4cea-ac4f-3fd707bd5737 nodeName:}" failed. No retries permitted until 2025-12-05 23:34:50.133012147 +0000 UTC m=+910.816416423 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9cce8abe-4425-4cea-ac4f-3fd707bd5737-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fhnqmj" (UID: "9cce8abe-4425-4cea-ac4f-3fd707bd5737") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 23:34:48 crc kubenswrapper[4734]: I1205 23:34:48.208518 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-667bd8d554-xwqfj"] Dec 05 23:34:48 crc kubenswrapper[4734]: I1205 23:34:48.238659 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4bt9z"] Dec 05 23:34:48 crc kubenswrapper[4734]: I1205 23:34:48.267909 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-w8l4b"] Dec 05 23:34:48 crc kubenswrapper[4734]: W1205 23:34:48.270681 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43c8ec4c_96f9_47f0_9313_2813ea1c62c2.slice/crio-9eadf35f33da33e2644629daa18cb7b4bd455718ef29ba0a2d1e967e8b555a89 WatchSource:0}: Error finding container 
9eadf35f33da33e2644629daa18cb7b4bd455718ef29ba0a2d1e967e8b555a89: Status 404 returned error can't find the container with id 9eadf35f33da33e2644629daa18cb7b4bd455718ef29ba0a2d1e967e8b555a89 Dec 05 23:34:48 crc kubenswrapper[4734]: W1205 23:34:48.285723 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad6bda6e_964f_44c3_b759_ad151097b4f1.slice/crio-c1aa2ca076ee09c38a3de1904893cb058681c959e39ddf547bc2a7b9e319de8d WatchSource:0}: Error finding container c1aa2ca076ee09c38a3de1904893cb058681c959e39ddf547bc2a7b9e319de8d: Status 404 returned error can't find the container with id c1aa2ca076ee09c38a3de1904893cb058681c959e39ddf547bc2a7b9e319de8d Dec 05 23:34:48 crc kubenswrapper[4734]: I1205 23:34:48.287073 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-wf4vr"] Dec 05 23:34:48 crc kubenswrapper[4734]: I1205 23:34:48.293596 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-hdt6h"] Dec 05 23:34:48 crc kubenswrapper[4734]: E1205 23:34:48.293934 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qjqh5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-58d5ff84df-2fm4z_openstack-operators(696f07ba-7c46-41f2-826f-890756824285): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 23:34:48 crc kubenswrapper[4734]: E1205 23:34:48.294069 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect 
--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-twjbx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-wf4vr_openstack-operators(aa5ccaa9-5087-4891-b255-a5135271a2a5): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 23:34:48 crc kubenswrapper[4734]: E1205 23:34:48.310718 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qjqh5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-58d5ff84df-2fm4z_openstack-operators(696f07ba-7c46-41f2-826f-890756824285): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 23:34:48 crc kubenswrapper[4734]: I1205 23:34:48.310763 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2fm4z"] Dec 05 23:34:48 crc kubenswrapper[4734]: E1205 23:34:48.311293 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-twjbx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-wf4vr_openstack-operators(aa5ccaa9-5087-4891-b255-a5135271a2a5): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 23:34:48 crc kubenswrapper[4734]: E1205 23:34:48.312439 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2fm4z" podUID="696f07ba-7c46-41f2-826f-890756824285" Dec 05 23:34:48 crc kubenswrapper[4734]: E1205 23:34:48.312669 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-wf4vr" podUID="aa5ccaa9-5087-4891-b255-a5135271a2a5" Dec 05 23:34:48 crc kubenswrapper[4734]: E1205 23:34:48.323077 4734 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rbkcf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-hdt6h_openstack-operators(ea29b614-e490-4a3e-925e-d9f6c56b0c35): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 23:34:48 crc kubenswrapper[4734]: E1205 23:34:48.329241 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rbkcf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-hdt6h_openstack-operators(ea29b614-e490-4a3e-925e-d9f6c56b0c35): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 23:34:48 crc kubenswrapper[4734]: I1205 23:34:48.329460 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-bf28l"] Dec 05 23:34:48 crc kubenswrapper[4734]: E1205 23:34:48.329483 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v5tdn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-bf28l_openstack-operators(3ab5c543-f1e6-455c-a051-7940ffcc833d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 23:34:48 crc kubenswrapper[4734]: E1205 23:34:48.331015 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hdt6h" podUID="ea29b614-e490-4a3e-925e-d9f6c56b0c35" Dec 05 23:34:48 crc kubenswrapper[4734]: E1205 23:34:48.335440 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v5tdn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-bf28l_openstack-operators(3ab5c543-f1e6-455c-a051-7940ffcc833d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 23:34:48 crc kubenswrapper[4734]: E1205 23:34:48.335966 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: 
{{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jkkdn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-j4m2j_openstack-operators(974bff7e-6bfc-49c2-9d3d-831d1bf5385d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 23:34:48 crc kubenswrapper[4734]: E1205 23:34:48.337986 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" 
pod="openstack-operators/octavia-operator-controller-manager-998648c74-bf28l" podUID="3ab5c543-f1e6-455c-a051-7940ffcc833d" Dec 05 23:34:48 crc kubenswrapper[4734]: E1205 23:34:48.340191 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jkkdn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-j4m2j_openstack-operators(974bff7e-6bfc-49c2-9d3d-831d1bf5385d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 23:34:48 crc kubenswrapper[4734]: I1205 23:34:48.341555 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/test-operator-controller-manager-5854674fcc-j4m2j"] Dec 05 23:34:48 crc kubenswrapper[4734]: E1205 23:34:48.341615 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-j4m2j" podUID="974bff7e-6bfc-49c2-9d3d-831d1bf5385d" Dec 05 23:34:48 crc kubenswrapper[4734]: I1205 23:34:48.541314 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-metrics-certs\") pod \"openstack-operator-controller-manager-5845f76896-vhzwq\" (UID: \"2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e\") " pod="openstack-operators/openstack-operator-controller-manager-5845f76896-vhzwq" Dec 05 23:34:48 crc kubenswrapper[4734]: I1205 23:34:48.541472 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-webhook-certs\") pod \"openstack-operator-controller-manager-5845f76896-vhzwq\" (UID: \"2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e\") " pod="openstack-operators/openstack-operator-controller-manager-5845f76896-vhzwq" Dec 05 23:34:48 crc kubenswrapper[4734]: E1205 23:34:48.541743 4734 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 23:34:48 crc kubenswrapper[4734]: E1205 23:34:48.541791 4734 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 23:34:48 crc kubenswrapper[4734]: E1205 23:34:48.541853 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-webhook-certs 
podName:2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e nodeName:}" failed. No retries permitted until 2025-12-05 23:34:50.541834165 +0000 UTC m=+911.225238441 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-webhook-certs") pod "openstack-operator-controller-manager-5845f76896-vhzwq" (UID: "2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e") : secret "webhook-server-cert" not found Dec 05 23:34:48 crc kubenswrapper[4734]: E1205 23:34:48.541887 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-metrics-certs podName:2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e nodeName:}" failed. No retries permitted until 2025-12-05 23:34:50.541862925 +0000 UTC m=+911.225267291 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-metrics-certs") pod "openstack-operator-controller-manager-5845f76896-vhzwq" (UID: "2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e") : secret "metrics-server-cert" not found Dec 05 23:34:48 crc kubenswrapper[4734]: I1205 23:34:48.608988 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-w8l4b" event={"ID":"ad6bda6e-964f-44c3-b759-ad151097b4f1","Type":"ContainerStarted","Data":"c1aa2ca076ee09c38a3de1904893cb058681c959e39ddf547bc2a7b9e319de8d"} Dec 05 23:34:48 crc kubenswrapper[4734]: I1205 23:34:48.610569 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zx66z" event={"ID":"58aa2c14-9374-45b1-b6dd-07e849f23306","Type":"ContainerStarted","Data":"fa382185172f23d802f6b176212f721f7206a916ee09fe9166609cafc06bdbe8"} Dec 05 23:34:48 crc kubenswrapper[4734]: I1205 23:34:48.612392 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-lwpjm" event={"ID":"9883e2bb-76f7-476d-8a74-e358ebf37ed2","Type":"ContainerStarted","Data":"507b836bd919c4ce2432bb856fba3b8590bf6a25bb8b46719f4d25bc13f3793e"} Dec 05 23:34:48 crc kubenswrapper[4734]: I1205 23:34:48.615932 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2fm4z" event={"ID":"696f07ba-7c46-41f2-826f-890756824285","Type":"ContainerStarted","Data":"b89238b7a4ab8a0cfb88e7912616fc1b61ede739fbd30e46af1ac9709fa16858"} Dec 05 23:34:48 crc kubenswrapper[4734]: I1205 23:34:48.618087 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-rw8vg" event={"ID":"ef794353-3292-4809-94d8-105aaa36889e","Type":"ContainerStarted","Data":"4d74926a23e4b2fbac99afdd0dbc6081015ae7927afc6567df3a2249dc8c18c6"} Dec 05 23:34:48 crc kubenswrapper[4734]: E1205 23:34:48.618243 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2fm4z" podUID="696f07ba-7c46-41f2-826f-890756824285" Dec 05 23:34:48 crc kubenswrapper[4734]: I1205 23:34:48.626499 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-fwbcd" event={"ID":"157817be-876f-4157-87af-6ef317b91cb9","Type":"ContainerStarted","Data":"b7acad96ad1b3d8078da72052a1ee79368a76c6699b540770695e8579f0505c7"} Dec 05 23:34:48 crc kubenswrapper[4734]: I1205 23:34:48.649917 4734 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4bt9z" event={"ID":"43c8ec4c-96f9-47f0-9313-2813ea1c62c2","Type":"ContainerStarted","Data":"9eadf35f33da33e2644629daa18cb7b4bd455718ef29ba0a2d1e967e8b555a89"} Dec 05 23:34:48 crc kubenswrapper[4734]: I1205 23:34:48.661850 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-rp9j7" event={"ID":"c12a23f4-fdd7-455e-b74c-f757f15990ca","Type":"ContainerStarted","Data":"b2271d0cd034a831cf94c043b6d5e99962af796f3f9c9bd1ca69b5f0b487dcfb"} Dec 05 23:34:48 crc kubenswrapper[4734]: I1205 23:34:48.667146 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-t5gsd" event={"ID":"3255ef71-c5a8-4fef-a1ab-dc2107c710eb","Type":"ContainerStarted","Data":"04c0e9374b292a674fa10047a0e4d2c18dac1968f6c284586d66f7b9831e8371"} Dec 05 23:34:48 crc kubenswrapper[4734]: I1205 23:34:48.668465 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-cdtjx" event={"ID":"2050fd66-c55a-4048-a869-cb786b5f0d2b","Type":"ContainerStarted","Data":"60a39dcd866690f15f23b7b87245eae55133748ddfb696cbdd1a5da74f7a2c90"} Dec 05 23:34:48 crc kubenswrapper[4734]: I1205 23:34:48.670443 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-xwqfj" event={"ID":"b7ee6df9-99e2-480d-aa84-7618ff0cda2f","Type":"ContainerStarted","Data":"2dedc2e4ec441ab021abca03c8512e1bbcc3fa31e424da3bf3dcc22cc24abf41"} Dec 05 23:34:48 crc kubenswrapper[4734]: I1205 23:34:48.672719 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hdt6h" 
event={"ID":"ea29b614-e490-4a3e-925e-d9f6c56b0c35","Type":"ContainerStarted","Data":"ca6505f6930fa39eed5528aa47e9792b66d7a252ab96a92fc14642f194c81ecb"} Dec 05 23:34:48 crc kubenswrapper[4734]: E1205 23:34:48.688938 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hdt6h" podUID="ea29b614-e490-4a3e-925e-d9f6c56b0c35" Dec 05 23:34:48 crc kubenswrapper[4734]: I1205 23:34:48.693214 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-bf28l" event={"ID":"3ab5c543-f1e6-455c-a051-7940ffcc833d","Type":"ContainerStarted","Data":"ddf52ef05c161cd8c30f49352a52377deb6d20fc2c105df52d4d78d19b72df4d"} Dec 05 23:34:48 crc kubenswrapper[4734]: E1205 23:34:48.707548 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-bf28l" podUID="3ab5c543-f1e6-455c-a051-7940ffcc833d" Dec 05 23:34:48 crc kubenswrapper[4734]: I1205 23:34:48.707632 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-c7c94" 
event={"ID":"4685a9c2-ef1c-462d-848c-fbbea6a8ebfe","Type":"ContainerStarted","Data":"db00a9517993e4d98387ad874b62fe9bf782563a6610deaa28652fd26ef387c6"} Dec 05 23:34:48 crc kubenswrapper[4734]: I1205 23:34:48.716387 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-j4m2j" event={"ID":"974bff7e-6bfc-49c2-9d3d-831d1bf5385d","Type":"ContainerStarted","Data":"5c75d13ed3e4557e6163cb6324a9f5a2b9e73306a33ebbdd76dddf67545c3eac"} Dec 05 23:34:48 crc kubenswrapper[4734]: E1205 23:34:48.723044 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-j4m2j" podUID="974bff7e-6bfc-49c2-9d3d-831d1bf5385d" Dec 05 23:34:48 crc kubenswrapper[4734]: I1205 23:34:48.751610 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-wf4vr" event={"ID":"aa5ccaa9-5087-4891-b255-a5135271a2a5","Type":"ContainerStarted","Data":"e149165a853c7613626e72bc16ea10737879c234cfac04b2c4704399540c0415"} Dec 05 23:34:48 crc kubenswrapper[4734]: I1205 23:34:48.756929 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-nc6wd" event={"ID":"608bca6a-1cb5-44b9-91c6-32a77372a4e5","Type":"ContainerStarted","Data":"8d408e65a71eacd1b5c6db5cfdd1935663c02fa58eead439f2d366bfc94d2842"} Dec 05 23:34:48 crc kubenswrapper[4734]: E1205 23:34:48.757441 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for 
\"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-wf4vr" podUID="aa5ccaa9-5087-4891-b255-a5135271a2a5" Dec 05 23:34:49 crc kubenswrapper[4734]: E1205 23:34:49.793443 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-wf4vr" podUID="aa5ccaa9-5087-4891-b255-a5135271a2a5" Dec 05 23:34:49 crc kubenswrapper[4734]: E1205 23:34:49.793566 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-bf28l" podUID="3ab5c543-f1e6-455c-a051-7940ffcc833d" Dec 05 23:34:49 crc kubenswrapper[4734]: E1205 23:34:49.793629 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hdt6h" podUID="ea29b614-e490-4a3e-925e-d9f6c56b0c35" Dec 05 23:34:49 crc kubenswrapper[4734]: E1205 23:34:49.793707 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2fm4z" podUID="696f07ba-7c46-41f2-826f-890756824285" Dec 05 23:34:49 crc kubenswrapper[4734]: E1205 23:34:49.793697 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-j4m2j" podUID="974bff7e-6bfc-49c2-9d3d-831d1bf5385d" Dec 05 23:34:49 crc kubenswrapper[4734]: I1205 23:34:49.864761 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df5aaec7-4487-47a1-98c4-0206d0ecf7f4-cert\") pod \"infra-operator-controller-manager-78d48bff9d-r2427\" 
(UID: \"df5aaec7-4487-47a1-98c4-0206d0ecf7f4\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-r2427" Dec 05 23:34:49 crc kubenswrapper[4734]: E1205 23:34:49.864986 4734 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 23:34:49 crc kubenswrapper[4734]: E1205 23:34:49.865032 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df5aaec7-4487-47a1-98c4-0206d0ecf7f4-cert podName:df5aaec7-4487-47a1-98c4-0206d0ecf7f4 nodeName:}" failed. No retries permitted until 2025-12-05 23:34:53.86501617 +0000 UTC m=+914.548420446 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/df5aaec7-4487-47a1-98c4-0206d0ecf7f4-cert") pod "infra-operator-controller-manager-78d48bff9d-r2427" (UID: "df5aaec7-4487-47a1-98c4-0206d0ecf7f4") : secret "infra-operator-webhook-server-cert" not found Dec 05 23:34:50 crc kubenswrapper[4734]: I1205 23:34:50.172719 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9cce8abe-4425-4cea-ac4f-3fd707bd5737-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fhnqmj\" (UID: \"9cce8abe-4425-4cea-ac4f-3fd707bd5737\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fhnqmj" Dec 05 23:34:50 crc kubenswrapper[4734]: E1205 23:34:50.172976 4734 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 23:34:50 crc kubenswrapper[4734]: E1205 23:34:50.173040 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9cce8abe-4425-4cea-ac4f-3fd707bd5737-cert podName:9cce8abe-4425-4cea-ac4f-3fd707bd5737 nodeName:}" failed. 
No retries permitted until 2025-12-05 23:34:54.173020977 +0000 UTC m=+914.856425253 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9cce8abe-4425-4cea-ac4f-3fd707bd5737-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fhnqmj" (UID: "9cce8abe-4425-4cea-ac4f-3fd707bd5737") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 23:34:50 crc kubenswrapper[4734]: I1205 23:34:50.579680 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-metrics-certs\") pod \"openstack-operator-controller-manager-5845f76896-vhzwq\" (UID: \"2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e\") " pod="openstack-operators/openstack-operator-controller-manager-5845f76896-vhzwq" Dec 05 23:34:50 crc kubenswrapper[4734]: E1205 23:34:50.579944 4734 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 23:34:50 crc kubenswrapper[4734]: E1205 23:34:50.580024 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-metrics-certs podName:2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e nodeName:}" failed. No retries permitted until 2025-12-05 23:34:54.580006331 +0000 UTC m=+915.263410607 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-metrics-certs") pod "openstack-operator-controller-manager-5845f76896-vhzwq" (UID: "2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e") : secret "metrics-server-cert" not found Dec 05 23:34:50 crc kubenswrapper[4734]: I1205 23:34:50.581593 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-webhook-certs\") pod \"openstack-operator-controller-manager-5845f76896-vhzwq\" (UID: \"2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e\") " pod="openstack-operators/openstack-operator-controller-manager-5845f76896-vhzwq" Dec 05 23:34:50 crc kubenswrapper[4734]: E1205 23:34:50.581773 4734 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 23:34:50 crc kubenswrapper[4734]: E1205 23:34:50.581882 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-webhook-certs podName:2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e nodeName:}" failed. No retries permitted until 2025-12-05 23:34:54.581870856 +0000 UTC m=+915.265275132 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-webhook-certs") pod "openstack-operator-controller-manager-5845f76896-vhzwq" (UID: "2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e") : secret "webhook-server-cert" not found Dec 05 23:34:53 crc kubenswrapper[4734]: I1205 23:34:53.941551 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df5aaec7-4487-47a1-98c4-0206d0ecf7f4-cert\") pod \"infra-operator-controller-manager-78d48bff9d-r2427\" (UID: \"df5aaec7-4487-47a1-98c4-0206d0ecf7f4\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-r2427" Dec 05 23:34:53 crc kubenswrapper[4734]: E1205 23:34:53.941794 4734 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 23:34:53 crc kubenswrapper[4734]: E1205 23:34:53.942103 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df5aaec7-4487-47a1-98c4-0206d0ecf7f4-cert podName:df5aaec7-4487-47a1-98c4-0206d0ecf7f4 nodeName:}" failed. No retries permitted until 2025-12-05 23:35:01.942082291 +0000 UTC m=+922.625486567 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/df5aaec7-4487-47a1-98c4-0206d0ecf7f4-cert") pod "infra-operator-controller-manager-78d48bff9d-r2427" (UID: "df5aaec7-4487-47a1-98c4-0206d0ecf7f4") : secret "infra-operator-webhook-server-cert" not found Dec 05 23:34:54 crc kubenswrapper[4734]: I1205 23:34:54.246727 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9cce8abe-4425-4cea-ac4f-3fd707bd5737-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fhnqmj\" (UID: \"9cce8abe-4425-4cea-ac4f-3fd707bd5737\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fhnqmj" Dec 05 23:34:54 crc kubenswrapper[4734]: E1205 23:34:54.246967 4734 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 23:34:54 crc kubenswrapper[4734]: E1205 23:34:54.247255 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9cce8abe-4425-4cea-ac4f-3fd707bd5737-cert podName:9cce8abe-4425-4cea-ac4f-3fd707bd5737 nodeName:}" failed. No retries permitted until 2025-12-05 23:35:02.247232439 +0000 UTC m=+922.930636715 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9cce8abe-4425-4cea-ac4f-3fd707bd5737-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fhnqmj" (UID: "9cce8abe-4425-4cea-ac4f-3fd707bd5737") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 23:34:54 crc kubenswrapper[4734]: I1205 23:34:54.654568 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-webhook-certs\") pod \"openstack-operator-controller-manager-5845f76896-vhzwq\" (UID: \"2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e\") " pod="openstack-operators/openstack-operator-controller-manager-5845f76896-vhzwq" Dec 05 23:34:54 crc kubenswrapper[4734]: I1205 23:34:54.655159 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-metrics-certs\") pod \"openstack-operator-controller-manager-5845f76896-vhzwq\" (UID: \"2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e\") " pod="openstack-operators/openstack-operator-controller-manager-5845f76896-vhzwq" Dec 05 23:34:54 crc kubenswrapper[4734]: E1205 23:34:54.655365 4734 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 23:34:54 crc kubenswrapper[4734]: E1205 23:34:54.655442 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-metrics-certs podName:2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e nodeName:}" failed. No retries permitted until 2025-12-05 23:35:02.655417852 +0000 UTC m=+923.338822138 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-metrics-certs") pod "openstack-operator-controller-manager-5845f76896-vhzwq" (UID: "2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e") : secret "metrics-server-cert" not found Dec 05 23:34:54 crc kubenswrapper[4734]: E1205 23:34:54.655893 4734 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 23:34:54 crc kubenswrapper[4734]: E1205 23:34:54.655923 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-webhook-certs podName:2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e nodeName:}" failed. No retries permitted until 2025-12-05 23:35:02.655913984 +0000 UTC m=+923.339318260 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-webhook-certs") pod "openstack-operator-controller-manager-5845f76896-vhzwq" (UID: "2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e") : secret "webhook-server-cert" not found Dec 05 23:35:01 crc kubenswrapper[4734]: E1205 23:35:01.627913 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:900050d3501c0785b227db34b89883efe68247816e5c7427cacb74f8aa10605a" Dec 05 23:35:01 crc kubenswrapper[4734]: E1205 23:35:01.628943 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:900050d3501c0785b227db34b89883efe68247816e5c7427cacb74f8aa10605a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jsjxz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-697fb699cf-fwbcd_openstack-operators(157817be-876f-4157-87af-6ef317b91cb9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 23:35:01 crc kubenswrapper[4734]: I1205 23:35:01.990134 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df5aaec7-4487-47a1-98c4-0206d0ecf7f4-cert\") pod \"infra-operator-controller-manager-78d48bff9d-r2427\" (UID: \"df5aaec7-4487-47a1-98c4-0206d0ecf7f4\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-r2427" Dec 05 23:35:02 crc kubenswrapper[4734]: I1205 23:35:02.000461 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df5aaec7-4487-47a1-98c4-0206d0ecf7f4-cert\") pod \"infra-operator-controller-manager-78d48bff9d-r2427\" (UID: \"df5aaec7-4487-47a1-98c4-0206d0ecf7f4\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-r2427" Dec 05 23:35:02 crc kubenswrapper[4734]: I1205 23:35:02.220454 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-r2427" Dec 05 23:35:02 crc kubenswrapper[4734]: E1205 23:35:02.262087 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557" Dec 05 23:35:02 crc kubenswrapper[4734]: E1205 23:35:02.262344 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bg7r2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-zx66z_openstack-operators(58aa2c14-9374-45b1-b6dd-07e849f23306): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 23:35:02 crc kubenswrapper[4734]: I1205 23:35:02.300493 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9cce8abe-4425-4cea-ac4f-3fd707bd5737-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fhnqmj\" (UID: \"9cce8abe-4425-4cea-ac4f-3fd707bd5737\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fhnqmj" Dec 05 23:35:02 crc kubenswrapper[4734]: E1205 23:35:02.300773 4734 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 23:35:02 crc kubenswrapper[4734]: E1205 23:35:02.300940 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9cce8abe-4425-4cea-ac4f-3fd707bd5737-cert podName:9cce8abe-4425-4cea-ac4f-3fd707bd5737 nodeName:}" failed. 
No retries permitted until 2025-12-05 23:35:18.300906147 +0000 UTC m=+938.984310413 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9cce8abe-4425-4cea-ac4f-3fd707bd5737-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fhnqmj" (UID: "9cce8abe-4425-4cea-ac4f-3fd707bd5737") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 23:35:02 crc kubenswrapper[4734]: I1205 23:35:02.712446 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-webhook-certs\") pod \"openstack-operator-controller-manager-5845f76896-vhzwq\" (UID: \"2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e\") " pod="openstack-operators/openstack-operator-controller-manager-5845f76896-vhzwq" Dec 05 23:35:02 crc kubenswrapper[4734]: E1205 23:35:02.712673 4734 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 23:35:02 crc kubenswrapper[4734]: I1205 23:35:02.713010 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-metrics-certs\") pod \"openstack-operator-controller-manager-5845f76896-vhzwq\" (UID: \"2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e\") " pod="openstack-operators/openstack-operator-controller-manager-5845f76896-vhzwq" Dec 05 23:35:02 crc kubenswrapper[4734]: E1205 23:35:02.713049 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-webhook-certs podName:2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e nodeName:}" failed. No retries permitted until 2025-12-05 23:35:18.713027224 +0000 UTC m=+939.396431500 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-webhook-certs") pod "openstack-operator-controller-manager-5845f76896-vhzwq" (UID: "2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e") : secret "webhook-server-cert" not found Dec 05 23:35:02 crc kubenswrapper[4734]: E1205 23:35:02.714022 4734 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 23:35:02 crc kubenswrapper[4734]: E1205 23:35:02.714090 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-metrics-certs podName:2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e nodeName:}" failed. No retries permitted until 2025-12-05 23:35:18.714057229 +0000 UTC m=+939.397461505 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-metrics-certs") pod "openstack-operator-controller-manager-5845f76896-vhzwq" (UID: "2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e") : secret "metrics-server-cert" not found Dec 05 23:35:03 crc kubenswrapper[4734]: E1205 23:35:03.068789 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:5370dc4a8e776923eec00bb50cbdb2e390e9dde50be26bdc04a216bd2d6b5027" Dec 05 23:35:03 crc kubenswrapper[4734]: E1205 23:35:03.069031 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:5370dc4a8e776923eec00bb50cbdb2e390e9dde50be26bdc04a216bd2d6b5027,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jq99m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-5697bb5779-rp9j7_openstack-operators(c12a23f4-fdd7-455e-b74c-f757f15990ca): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 23:35:03 crc kubenswrapper[4734]: E1205 23:35:03.863516 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:6b3e0302608a2e70f9b5ae9167f6fbf59264f226d9db99d48f70466ab2f216b8" Dec 05 23:35:03 crc kubenswrapper[4734]: E1205 23:35:03.863811 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:6b3e0302608a2e70f9b5ae9167f6fbf59264f226d9db99d48f70466ab2f216b8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q67qq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-667bd8d554-xwqfj_openstack-operators(b7ee6df9-99e2-480d-aa84-7618ff0cda2f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 23:35:04 crc kubenswrapper[4734]: E1205 23:35:04.637238 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991" Dec 05 23:35:04 crc kubenswrapper[4734]: E1205 23:35:04.637805 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w4h8q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-9d58d64bc-cdtjx_openstack-operators(2050fd66-c55a-4048-a869-cb786b5f0d2b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 23:35:05 crc kubenswrapper[4734]: E1205 23:35:05.422903 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:5bdb3685be3ddc1efd62e16aaf2fa96ead64315e26d52b1b2a7d8ac01baa1e87" Dec 05 23:35:05 crc kubenswrapper[4734]: E1205 23:35:05.423565 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:5bdb3685be3ddc1efd62e16aaf2fa96ead64315e26d52b1b2a7d8ac01baa1e87,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wtv7s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-967d97867-rw8vg_openstack-operators(ef794353-3292-4809-94d8-105aaa36889e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 23:35:06 crc kubenswrapper[4734]: E1205 23:35:06.127042 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:981b6a8f95934a86c5f10ef6e198b07265aeba7f11cf84b9ccd13dfaf06f3ca3" Dec 05 23:35:06 crc kubenswrapper[4734]: E1205 23:35:06.127330 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:981b6a8f95934a86c5f10ef6e198b07265aeba7f11cf84b9ccd13dfaf06f3ca3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t42c7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-6c677c69b-4clqb_openstack-operators(6ba0bb79-4132-4bd9-a2ce-c8a9b516402d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 23:35:06 crc kubenswrapper[4734]: E1205 23:35:06.709011 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f" Dec 05 23:35:06 crc kubenswrapper[4734]: E1205 23:35:06.709237 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vvnzt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-w8l4b_openstack-operators(ad6bda6e-964f-44c3-b759-ad151097b4f1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 23:35:07 crc kubenswrapper[4734]: E1205 23:35:07.386300 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:424da951f13f1fbe9083215dc9f5088f90676dd813f01fdf3c1a8639b61cbaad" Dec 05 23:35:07 crc kubenswrapper[4734]: E1205 23:35:07.386624 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:424da951f13f1fbe9083215dc9f5088f90676dd813f01fdf3c1a8639b61cbaad,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lsk7f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-79c8c4686c-nc6wd_openstack-operators(608bca6a-1cb5-44b9-91c6-32a77372a4e5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 23:35:09 crc kubenswrapper[4734]: E1205 23:35:09.096024 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 05 23:35:09 crc kubenswrapper[4734]: E1205 23:35:09.096242 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sgr4f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-t5gsd_openstack-operators(3255ef71-c5a8-4fef-a1ab-dc2107c710eb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 23:35:13 crc kubenswrapper[4734]: E1205 23:35:13.191186 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 05 23:35:13 crc kubenswrapper[4734]: E1205 23:35:13.192751 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m2r6d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-4bt9z_openstack-operators(43c8ec4c-96f9-47f0-9313-2813ea1c62c2): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 23:35:13 crc kubenswrapper[4734]: E1205 23:35:13.194869 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4bt9z" podUID="43c8ec4c-96f9-47f0-9313-2813ea1c62c2" Dec 05 23:35:14 crc kubenswrapper[4734]: E1205 23:35:14.035293 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4bt9z" podUID="43c8ec4c-96f9-47f0-9313-2813ea1c62c2" Dec 05 23:35:14 crc kubenswrapper[4734]: I1205 23:35:14.240203 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-r2427"] Dec 05 23:35:15 crc kubenswrapper[4734]: I1205 23:35:15.049432 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-r2427" event={"ID":"df5aaec7-4487-47a1-98c4-0206d0ecf7f4","Type":"ContainerStarted","Data":"7b7c022788760c4379722d26316ac14dc201c359e78589090873efcdb6df5240"} Dec 05 23:35:15 crc kubenswrapper[4734]: I1205 23:35:15.052400 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-bl5vb" event={"ID":"4ac00d0e-d1c1-44d8-869d-1d98f5a137e0","Type":"ContainerStarted","Data":"aa3f628e89d1491b488d698a4bddedb31e1a0ae2c9c3cc13e600f9a7be2cc95b"} Dec 05 23:35:15 crc kubenswrapper[4734]: I1205 23:35:15.056724 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mnmlv" event={"ID":"9a792918-0311-4b1b-8920-a315370ecba7","Type":"ContainerStarted","Data":"fd108b7efd3269431724b6d28f6bbef26e4bf3088b8bf9741cc5167dd94819a2"} Dec 05 23:35:15 crc kubenswrapper[4734]: I1205 23:35:15.058946 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-c7c94" event={"ID":"4685a9c2-ef1c-462d-848c-fbbea6a8ebfe","Type":"ContainerStarted","Data":"e5a639304dc6f8b3071a0a2506b1494354118663d4c52d67346500be23ded095"} Dec 05 23:35:15 crc kubenswrapper[4734]: I1205 23:35:15.060285 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-lwpjm" event={"ID":"9883e2bb-76f7-476d-8a74-e358ebf37ed2","Type":"ContainerStarted","Data":"a48f72a20ec8337b5f42a08384bbd416526e57c009dadd68f90158d7c451347d"} Dec 05 23:35:16 crc kubenswrapper[4734]: E1205 23:35:16.970796 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zx66z" podUID="58aa2c14-9374-45b1-b6dd-07e849f23306" Dec 05 23:35:17 crc kubenswrapper[4734]: E1205 23:35:17.023949 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-4clqb" podUID="6ba0bb79-4132-4bd9-a2ce-c8a9b516402d" Dec 05 23:35:17 crc kubenswrapper[4734]: I1205 23:35:17.088998 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-wf4vr" 
event={"ID":"aa5ccaa9-5087-4891-b255-a5135271a2a5","Type":"ContainerStarted","Data":"3976cf149178115818fcb02aacd420b9817653ca58a943efaf2b671a4f236900"} Dec 05 23:35:17 crc kubenswrapper[4734]: I1205 23:35:17.090284 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hdt6h" event={"ID":"ea29b614-e490-4a3e-925e-d9f6c56b0c35","Type":"ContainerStarted","Data":"891118c23be324b4a4821a32c5d5c70544f4e8342a4b3e5b2a3cd7fbd9b59b1f"} Dec 05 23:35:17 crc kubenswrapper[4734]: I1205 23:35:17.091298 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zx66z" event={"ID":"58aa2c14-9374-45b1-b6dd-07e849f23306","Type":"ContainerStarted","Data":"2617eefe484cc650e2a4c5211dbbd0c4531fb6817fe919df0bf6968200b46243"} Dec 05 23:35:17 crc kubenswrapper[4734]: E1205 23:35:17.093896 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-cdtjx" podUID="2050fd66-c55a-4048-a869-cb786b5f0d2b" Dec 05 23:35:17 crc kubenswrapper[4734]: E1205 23:35:17.096664 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-fwbcd" podUID="157817be-876f-4157-87af-6ef317b91cb9" Dec 05 23:35:17 crc kubenswrapper[4734]: E1205 23:35:17.097037 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-rp9j7" podUID="c12a23f4-fdd7-455e-b74c-f757f15990ca" 
Dec 05 23:35:17 crc kubenswrapper[4734]: I1205 23:35:17.148067 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-rp9j7" event={"ID":"c12a23f4-fdd7-455e-b74c-f757f15990ca","Type":"ContainerStarted","Data":"563de99fbb8427776fbe3f47be3150b76e5e96f8f090bd1aa8f97848862740b1"} Dec 05 23:35:17 crc kubenswrapper[4734]: I1205 23:35:17.187016 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2fm4z" event={"ID":"696f07ba-7c46-41f2-826f-890756824285","Type":"ContainerStarted","Data":"ed744c218a03ee9a90408840b7422673bf988563488eac141c0cb0ded7ed5563"} Dec 05 23:35:17 crc kubenswrapper[4734]: I1205 23:35:17.205577 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-cdtjx" event={"ID":"2050fd66-c55a-4048-a869-cb786b5f0d2b","Type":"ContainerStarted","Data":"b8fa5342a7e2cb56622be74c2f32b31d00681cbfb0fab12f033012383a928899"} Dec 05 23:35:17 crc kubenswrapper[4734]: E1205 23:35:17.223781 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-78f8948974-w8l4b" podUID="ad6bda6e-964f-44c3-b759-ad151097b4f1" Dec 05 23:35:17 crc kubenswrapper[4734]: I1205 23:35:17.230078 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-j4m2j" event={"ID":"974bff7e-6bfc-49c2-9d3d-831d1bf5385d","Type":"ContainerStarted","Data":"ed6e4e5e18d0454ccc054615d07764c45222f852e690bd509df2fde7bab74b0c"} Dec 05 23:35:17 crc kubenswrapper[4734]: I1205 23:35:17.259896 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-bf28l" 
event={"ID":"3ab5c543-f1e6-455c-a051-7940ffcc833d","Type":"ContainerStarted","Data":"b2a41cc2fa8b6ed296aa901a92c7c3d5fa62caacfa6218056c3cce8e3d338327"} Dec 05 23:35:17 crc kubenswrapper[4734]: I1205 23:35:17.292412 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-4clqb" event={"ID":"6ba0bb79-4132-4bd9-a2ce-c8a9b516402d","Type":"ContainerStarted","Data":"9565d308edd142ad59eeb980e907a2ee4709c3c7fc22bc65253e5583f54cfbdb"} Dec 05 23:35:17 crc kubenswrapper[4734]: E1205 23:35:17.299993 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-t5gsd" podUID="3255ef71-c5a8-4fef-a1ab-dc2107c710eb" Dec 05 23:35:17 crc kubenswrapper[4734]: I1205 23:35:17.314289 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-fwbcd" event={"ID":"157817be-876f-4157-87af-6ef317b91cb9","Type":"ContainerStarted","Data":"03c64380658a068c771c53efe7e2ca15556639ba242579e45520eb7cb84edaf6"} Dec 05 23:35:18 crc kubenswrapper[4734]: I1205 23:35:18.328028 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9cce8abe-4425-4cea-ac4f-3fd707bd5737-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fhnqmj\" (UID: \"9cce8abe-4425-4cea-ac4f-3fd707bd5737\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fhnqmj" Dec 05 23:35:18 crc kubenswrapper[4734]: I1205 23:35:18.336241 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-lwpjm" 
event={"ID":"9883e2bb-76f7-476d-8a74-e358ebf37ed2","Type":"ContainerStarted","Data":"ded2543df0acfe0dcdbee36defd5f5999ad9ad66277f159c27aed4ac308b163b"} Dec 05 23:35:18 crc kubenswrapper[4734]: I1205 23:35:18.336662 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-lwpjm" Dec 05 23:35:18 crc kubenswrapper[4734]: I1205 23:35:18.337600 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-t5gsd" event={"ID":"3255ef71-c5a8-4fef-a1ab-dc2107c710eb","Type":"ContainerStarted","Data":"db4c52e58ad082fd66b86f57a56666a4489afdde088c9bd446e32279c333d960"} Dec 05 23:35:18 crc kubenswrapper[4734]: I1205 23:35:18.339930 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-w8l4b" event={"ID":"ad6bda6e-964f-44c3-b759-ad151097b4f1","Type":"ContainerStarted","Data":"f3171a91cd6862f74b6761b97d500af7f2e9970decdc4283532421384a20def6"} Dec 05 23:35:18 crc kubenswrapper[4734]: E1205 23:35:18.340270 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-t5gsd" podUID="3255ef71-c5a8-4fef-a1ab-dc2107c710eb" Dec 05 23:35:18 crc kubenswrapper[4734]: I1205 23:35:18.342257 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-bl5vb" event={"ID":"4ac00d0e-d1c1-44d8-869d-1d98f5a137e0","Type":"ContainerStarted","Data":"12f32534350bd61af4dfc2f3d341f1bf06a5177f4b36396b3479b79baae0dcba"} Dec 05 23:35:18 crc kubenswrapper[4734]: I1205 23:35:18.342783 4734 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-bl5vb" Dec 05 23:35:18 crc kubenswrapper[4734]: I1205 23:35:18.344016 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9cce8abe-4425-4cea-ac4f-3fd707bd5737-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fhnqmj\" (UID: \"9cce8abe-4425-4cea-ac4f-3fd707bd5737\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fhnqmj" Dec 05 23:35:18 crc kubenswrapper[4734]: I1205 23:35:18.347571 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-bf28l" event={"ID":"3ab5c543-f1e6-455c-a051-7940ffcc833d","Type":"ContainerStarted","Data":"201d3ebf2330e04e11f79ab75822751ddd8abc646c293499333d9a353ce3f810"} Dec 05 23:35:18 crc kubenswrapper[4734]: I1205 23:35:18.348282 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-bf28l" Dec 05 23:35:18 crc kubenswrapper[4734]: I1205 23:35:18.364303 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-c7c94" event={"ID":"4685a9c2-ef1c-462d-848c-fbbea6a8ebfe","Type":"ContainerStarted","Data":"0fcf06250d854669c421bb23868d71f01a15e9bef1cd8c3353d09528df870679"} Dec 05 23:35:18 crc kubenswrapper[4734]: I1205 23:35:18.364379 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-c7c94" Dec 05 23:35:18 crc kubenswrapper[4734]: I1205 23:35:18.377985 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-lwpjm" podStartSLOduration=4.720644395 podStartE2EDuration="33.37796017s" podCreationTimestamp="2025-12-05 23:34:45 +0000 UTC" 
firstStartedPulling="2025-12-05 23:34:47.847515754 +0000 UTC m=+908.530920060" lastFinishedPulling="2025-12-05 23:35:16.504831559 +0000 UTC m=+937.188235835" observedRunningTime="2025-12-05 23:35:18.360870405 +0000 UTC m=+939.044274681" watchObservedRunningTime="2025-12-05 23:35:18.37796017 +0000 UTC m=+939.061364456" Dec 05 23:35:18 crc kubenswrapper[4734]: I1205 23:35:18.378906 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-bf28l" podStartSLOduration=6.943518913 podStartE2EDuration="32.378900642s" podCreationTimestamp="2025-12-05 23:34:46 +0000 UTC" firstStartedPulling="2025-12-05 23:34:48.329234057 +0000 UTC m=+909.012638333" lastFinishedPulling="2025-12-05 23:35:13.764615786 +0000 UTC m=+934.448020062" observedRunningTime="2025-12-05 23:35:18.375823308 +0000 UTC m=+939.059227584" watchObservedRunningTime="2025-12-05 23:35:18.378900642 +0000 UTC m=+939.062304918" Dec 05 23:35:18 crc kubenswrapper[4734]: I1205 23:35:18.442263 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-bl5vb" podStartSLOduration=4.010354241 podStartE2EDuration="33.442240905s" podCreationTimestamp="2025-12-05 23:34:45 +0000 UTC" firstStartedPulling="2025-12-05 23:34:47.057752647 +0000 UTC m=+907.741156923" lastFinishedPulling="2025-12-05 23:35:16.489639311 +0000 UTC m=+937.173043587" observedRunningTime="2025-12-05 23:35:18.439640163 +0000 UTC m=+939.123044439" watchObservedRunningTime="2025-12-05 23:35:18.442240905 +0000 UTC m=+939.125645181" Dec 05 23:35:18 crc kubenswrapper[4734]: I1205 23:35:18.480775 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-c7c94" podStartSLOduration=4.368475259 podStartE2EDuration="33.480157014s" podCreationTimestamp="2025-12-05 23:34:45 +0000 UTC" firstStartedPulling="2025-12-05 
23:34:47.653941408 +0000 UTC m=+908.337345684" lastFinishedPulling="2025-12-05 23:35:16.765623173 +0000 UTC m=+937.449027439" observedRunningTime="2025-12-05 23:35:18.477355076 +0000 UTC m=+939.160759352" watchObservedRunningTime="2025-12-05 23:35:18.480157014 +0000 UTC m=+939.163561290" Dec 05 23:35:18 crc kubenswrapper[4734]: I1205 23:35:18.508415 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fhnqmj" Dec 05 23:35:18 crc kubenswrapper[4734]: I1205 23:35:18.740216 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-metrics-certs\") pod \"openstack-operator-controller-manager-5845f76896-vhzwq\" (UID: \"2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e\") " pod="openstack-operators/openstack-operator-controller-manager-5845f76896-vhzwq" Dec 05 23:35:18 crc kubenswrapper[4734]: I1205 23:35:18.740848 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-webhook-certs\") pod \"openstack-operator-controller-manager-5845f76896-vhzwq\" (UID: \"2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e\") " pod="openstack-operators/openstack-operator-controller-manager-5845f76896-vhzwq" Dec 05 23:35:18 crc kubenswrapper[4734]: I1205 23:35:18.747482 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-metrics-certs\") pod \"openstack-operator-controller-manager-5845f76896-vhzwq\" (UID: \"2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e\") " pod="openstack-operators/openstack-operator-controller-manager-5845f76896-vhzwq" Dec 05 23:35:18 crc kubenswrapper[4734]: I1205 23:35:18.747601 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e-webhook-certs\") pod \"openstack-operator-controller-manager-5845f76896-vhzwq\" (UID: \"2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e\") " pod="openstack-operators/openstack-operator-controller-manager-5845f76896-vhzwq" Dec 05 23:35:18 crc kubenswrapper[4734]: I1205 23:35:18.968147 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5845f76896-vhzwq" Dec 05 23:35:19 crc kubenswrapper[4734]: I1205 23:35:19.012827 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fhnqmj"] Dec 05 23:35:19 crc kubenswrapper[4734]: W1205 23:35:19.029046 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cce8abe_4425_4cea_ac4f_3fd707bd5737.slice/crio-582045d59b401932fa05157d4986db7129cc4f8be712cde3cd3a32bd99a4543d WatchSource:0}: Error finding container 582045d59b401932fa05157d4986db7129cc4f8be712cde3cd3a32bd99a4543d: Status 404 returned error can't find the container with id 582045d59b401932fa05157d4986db7129cc4f8be712cde3cd3a32bd99a4543d Dec 05 23:35:19 crc kubenswrapper[4734]: E1205 23:35:19.152107 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-967d97867-rw8vg" podUID="ef794353-3292-4809-94d8-105aaa36889e" Dec 05 23:35:19 crc kubenswrapper[4734]: E1205 23:35:19.162221 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-nc6wd" 
podUID="608bca6a-1cb5-44b9-91c6-32a77372a4e5" Dec 05 23:35:19 crc kubenswrapper[4734]: E1205 23:35:19.162820 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-xwqfj" podUID="b7ee6df9-99e2-480d-aa84-7618ff0cda2f" Dec 05 23:35:19 crc kubenswrapper[4734]: I1205 23:35:19.393042 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-xwqfj" event={"ID":"b7ee6df9-99e2-480d-aa84-7618ff0cda2f","Type":"ContainerStarted","Data":"c60c0cf93192988b54294a5dd68dbe59762e8590deda9499598d2e654dfe9a03"} Dec 05 23:35:19 crc kubenswrapper[4734]: I1205 23:35:19.471381 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-j4m2j" event={"ID":"974bff7e-6bfc-49c2-9d3d-831d1bf5385d","Type":"ContainerStarted","Data":"b332d7a6c376a0dd2eef1664dac05afa8c26cc8c24a67b8fa6b01d5495e4ad9a"} Dec 05 23:35:19 crc kubenswrapper[4734]: I1205 23:35:19.471853 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-j4m2j" Dec 05 23:35:19 crc kubenswrapper[4734]: I1205 23:35:19.474927 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-nc6wd" event={"ID":"608bca6a-1cb5-44b9-91c6-32a77372a4e5","Type":"ContainerStarted","Data":"0b5731163f4b9944c7b29e2d425b04c3c62db20f36f07dc94bc0799804a7adf0"} Dec 05 23:35:19 crc kubenswrapper[4734]: I1205 23:35:19.520167 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-j4m2j" podStartSLOduration=8.445530628 podStartE2EDuration="33.520136592s" podCreationTimestamp="2025-12-05 
23:34:46 +0000 UTC" firstStartedPulling="2025-12-05 23:34:48.335688634 +0000 UTC m=+909.019092910" lastFinishedPulling="2025-12-05 23:35:13.410294598 +0000 UTC m=+934.093698874" observedRunningTime="2025-12-05 23:35:19.504063054 +0000 UTC m=+940.187467330" watchObservedRunningTime="2025-12-05 23:35:19.520136592 +0000 UTC m=+940.203540868" Dec 05 23:35:19 crc kubenswrapper[4734]: I1205 23:35:19.581089 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hdt6h" event={"ID":"ea29b614-e490-4a3e-925e-d9f6c56b0c35","Type":"ContainerStarted","Data":"578db636d1a20ba7c28873a2227763af008e63af17f6b00c3d370f414f329ced"} Dec 05 23:35:19 crc kubenswrapper[4734]: I1205 23:35:19.581602 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hdt6h" Dec 05 23:35:19 crc kubenswrapper[4734]: I1205 23:35:19.586638 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mnmlv" event={"ID":"9a792918-0311-4b1b-8920-a315370ecba7","Type":"ContainerStarted","Data":"7d326a2ec54c69b34589602c6b771c3a84ac5e36b5e77964cb9f0e1d1c29d59c"} Dec 05 23:35:19 crc kubenswrapper[4734]: I1205 23:35:19.587775 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mnmlv" Dec 05 23:35:19 crc kubenswrapper[4734]: I1205 23:35:19.607117 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zx66z" event={"ID":"58aa2c14-9374-45b1-b6dd-07e849f23306","Type":"ContainerStarted","Data":"41b92741497b482ad14b5c6033505830bc4e7cffd2c17954f4f0d583b0e9e4c2"} Dec 05 23:35:19 crc kubenswrapper[4734]: I1205 23:35:19.607597 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zx66z" Dec 05 23:35:19 crc kubenswrapper[4734]: I1205 23:35:19.607907 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mnmlv" Dec 05 23:35:19 crc kubenswrapper[4734]: I1205 23:35:19.657950 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hdt6h" podStartSLOduration=8.223709259 podStartE2EDuration="33.657923339s" podCreationTimestamp="2025-12-05 23:34:46 +0000 UTC" firstStartedPulling="2025-12-05 23:34:48.322808962 +0000 UTC m=+909.006213238" lastFinishedPulling="2025-12-05 23:35:13.757023031 +0000 UTC m=+934.440427318" observedRunningTime="2025-12-05 23:35:19.652339174 +0000 UTC m=+940.335743450" watchObservedRunningTime="2025-12-05 23:35:19.657923339 +0000 UTC m=+940.341327615" Dec 05 23:35:19 crc kubenswrapper[4734]: I1205 23:35:19.673581 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-rw8vg" event={"ID":"ef794353-3292-4809-94d8-105aaa36889e","Type":"ContainerStarted","Data":"ea8adf087a20123bdfb66cb36b731352ff060ff026c2f6bf1c34951d5445ce5b"} Dec 05 23:35:19 crc kubenswrapper[4734]: I1205 23:35:19.673787 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fhnqmj" event={"ID":"9cce8abe-4425-4cea-ac4f-3fd707bd5737","Type":"ContainerStarted","Data":"582045d59b401932fa05157d4986db7129cc4f8be712cde3cd3a32bd99a4543d"} Dec 05 23:35:19 crc kubenswrapper[4734]: I1205 23:35:19.677734 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2fm4z" 
event={"ID":"696f07ba-7c46-41f2-826f-890756824285","Type":"ContainerStarted","Data":"5f942b3a942692ac53de3651de2f05535646852667c8a3ba92bb0e83a44600a7"} Dec 05 23:35:19 crc kubenswrapper[4734]: I1205 23:35:19.677952 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2fm4z" Dec 05 23:35:19 crc kubenswrapper[4734]: I1205 23:35:19.693841 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-cdtjx" event={"ID":"2050fd66-c55a-4048-a869-cb786b5f0d2b","Type":"ContainerStarted","Data":"4eec43688e1f0bd3a6a1a1d80714e56d6439ebbcf74194b15948ecdebcc85529"} Dec 05 23:35:19 crc kubenswrapper[4734]: I1205 23:35:19.695459 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-cdtjx" Dec 05 23:35:19 crc kubenswrapper[4734]: I1205 23:35:19.696146 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-mnmlv" podStartSLOduration=5.263528779 podStartE2EDuration="34.696131834s" podCreationTimestamp="2025-12-05 23:34:45 +0000 UTC" firstStartedPulling="2025-12-05 23:34:47.42313733 +0000 UTC m=+908.106541606" lastFinishedPulling="2025-12-05 23:35:16.855740385 +0000 UTC m=+937.539144661" observedRunningTime="2025-12-05 23:35:19.691726407 +0000 UTC m=+940.375130683" watchObservedRunningTime="2025-12-05 23:35:19.696131834 +0000 UTC m=+940.379536110" Dec 05 23:35:19 crc kubenswrapper[4734]: I1205 23:35:19.708615 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-wf4vr" event={"ID":"aa5ccaa9-5087-4891-b255-a5135271a2a5","Type":"ContainerStarted","Data":"022f28a9da1c4898762e75c8e38ae225f72e172cc369fc16c9d0e13eca372f23"} Dec 05 23:35:19 crc kubenswrapper[4734]: I1205 23:35:19.708657 4734 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-wf4vr" Dec 05 23:35:19 crc kubenswrapper[4734]: I1205 23:35:19.713220 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-bl5vb" Dec 05 23:35:19 crc kubenswrapper[4734]: I1205 23:35:19.717216 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-lwpjm" Dec 05 23:35:19 crc kubenswrapper[4734]: I1205 23:35:19.734372 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-c7c94" Dec 05 23:35:19 crc kubenswrapper[4734]: I1205 23:35:19.796335 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5845f76896-vhzwq"] Dec 05 23:35:19 crc kubenswrapper[4734]: I1205 23:35:19.873624 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zx66z" podStartSLOduration=4.863243318 podStartE2EDuration="34.873593031s" podCreationTimestamp="2025-12-05 23:34:45 +0000 UTC" firstStartedPulling="2025-12-05 23:34:47.8303842 +0000 UTC m=+908.513788476" lastFinishedPulling="2025-12-05 23:35:17.840733913 +0000 UTC m=+938.524138189" observedRunningTime="2025-12-05 23:35:19.866881758 +0000 UTC m=+940.550286034" watchObservedRunningTime="2025-12-05 23:35:19.873593031 +0000 UTC m=+940.556997307" Dec 05 23:35:19 crc kubenswrapper[4734]: I1205 23:35:19.933749 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-wf4vr" podStartSLOduration=9.816626074 podStartE2EDuration="34.933717866s" podCreationTimestamp="2025-12-05 23:34:45 +0000 UTC" 
firstStartedPulling="2025-12-05 23:34:48.293655236 +0000 UTC m=+908.977059512" lastFinishedPulling="2025-12-05 23:35:13.410747028 +0000 UTC m=+934.094151304" observedRunningTime="2025-12-05 23:35:19.930678493 +0000 UTC m=+940.614082769" watchObservedRunningTime="2025-12-05 23:35:19.933717866 +0000 UTC m=+940.617122142" Dec 05 23:35:20 crc kubenswrapper[4734]: I1205 23:35:19.994510 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2fm4z" podStartSLOduration=8.877868236 podStartE2EDuration="33.994485217s" podCreationTimestamp="2025-12-05 23:34:46 +0000 UTC" firstStartedPulling="2025-12-05 23:34:48.293681967 +0000 UTC m=+908.977086243" lastFinishedPulling="2025-12-05 23:35:13.410298938 +0000 UTC m=+934.093703224" observedRunningTime="2025-12-05 23:35:19.991363792 +0000 UTC m=+940.674768058" watchObservedRunningTime="2025-12-05 23:35:19.994485217 +0000 UTC m=+940.677889493" Dec 05 23:35:20 crc kubenswrapper[4734]: I1205 23:35:20.090702 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-cdtjx" podStartSLOduration=4.099567508 podStartE2EDuration="34.090682096s" podCreationTimestamp="2025-12-05 23:34:46 +0000 UTC" firstStartedPulling="2025-12-05 23:34:47.85105922 +0000 UTC m=+908.534463496" lastFinishedPulling="2025-12-05 23:35:17.842173808 +0000 UTC m=+938.525578084" observedRunningTime="2025-12-05 23:35:20.089223921 +0000 UTC m=+940.772628187" watchObservedRunningTime="2025-12-05 23:35:20.090682096 +0000 UTC m=+940.774086372" Dec 05 23:35:20 crc kubenswrapper[4734]: I1205 23:35:20.448200 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 
23:35:20 crc kubenswrapper[4734]: I1205 23:35:20.448270 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:35:20 crc kubenswrapper[4734]: I1205 23:35:20.727781 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-fwbcd" event={"ID":"157817be-876f-4157-87af-6ef317b91cb9","Type":"ContainerStarted","Data":"0b33b9d185123d777c160606aaf0a0b478a1eec72213d07634542c4a2509b800"} Dec 05 23:35:20 crc kubenswrapper[4734]: I1205 23:35:20.729030 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-fwbcd" Dec 05 23:35:20 crc kubenswrapper[4734]: I1205 23:35:20.733134 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-rp9j7" event={"ID":"c12a23f4-fdd7-455e-b74c-f757f15990ca","Type":"ContainerStarted","Data":"4bc11d4c221b05a2efb801818208c50b7f137e3a98be7fdad37511ff0f2548dd"} Dec 05 23:35:20 crc kubenswrapper[4734]: I1205 23:35:20.733360 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-rp9j7" Dec 05 23:35:20 crc kubenswrapper[4734]: I1205 23:35:20.739255 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5845f76896-vhzwq" event={"ID":"2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e","Type":"ContainerStarted","Data":"a3558fb3f55c548cdc7b8e4f07b74603b6e8811fbedcf3eadaab03c37bf4430f"} Dec 05 23:35:20 crc kubenswrapper[4734]: I1205 23:35:20.739343 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-manager-5845f76896-vhzwq" event={"ID":"2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e","Type":"ContainerStarted","Data":"fc138970270892b2b2dca5ba74a824799485df3e798729caa8bbb2b8291f50b7"} Dec 05 23:35:20 crc kubenswrapper[4734]: I1205 23:35:20.740313 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5845f76896-vhzwq" Dec 05 23:35:20 crc kubenswrapper[4734]: I1205 23:35:20.742155 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-w8l4b" event={"ID":"ad6bda6e-964f-44c3-b759-ad151097b4f1","Type":"ContainerStarted","Data":"1dbc5f7c07888c2450337682e0a0ffb5bcfbfa98ee219398a70d9eed7bc51dcb"} Dec 05 23:35:20 crc kubenswrapper[4734]: I1205 23:35:20.742309 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-w8l4b" Dec 05 23:35:20 crc kubenswrapper[4734]: I1205 23:35:20.744549 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-4clqb" event={"ID":"6ba0bb79-4132-4bd9-a2ce-c8a9b516402d","Type":"ContainerStarted","Data":"840996d8a9e39bd7049df1be1aeee752b89c0b41102ae8e46a95f49243c43a69"} Dec 05 23:35:20 crc kubenswrapper[4734]: I1205 23:35:20.744595 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-4clqb" Dec 05 23:35:20 crc kubenswrapper[4734]: I1205 23:35:20.754517 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-fwbcd" podStartSLOduration=4.318200211 podStartE2EDuration="35.754497208s" podCreationTimestamp="2025-12-05 23:34:45 +0000 UTC" firstStartedPulling="2025-12-05 23:34:47.838444525 +0000 UTC m=+908.521848801" 
lastFinishedPulling="2025-12-05 23:35:19.274741522 +0000 UTC m=+939.958145798" observedRunningTime="2025-12-05 23:35:20.750136853 +0000 UTC m=+941.433541119" watchObservedRunningTime="2025-12-05 23:35:20.754497208 +0000 UTC m=+941.437901484" Dec 05 23:35:20 crc kubenswrapper[4734]: I1205 23:35:20.777377 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-rp9j7" podStartSLOduration=4.001004761 podStartE2EDuration="35.777358461s" podCreationTimestamp="2025-12-05 23:34:45 +0000 UTC" firstStartedPulling="2025-12-05 23:34:47.850198329 +0000 UTC m=+908.533602605" lastFinishedPulling="2025-12-05 23:35:19.626552029 +0000 UTC m=+940.309956305" observedRunningTime="2025-12-05 23:35:20.776569323 +0000 UTC m=+941.459973599" watchObservedRunningTime="2025-12-05 23:35:20.777358461 +0000 UTC m=+941.460762737" Dec 05 23:35:20 crc kubenswrapper[4734]: I1205 23:35:20.892334 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5845f76896-vhzwq" podStartSLOduration=34.892306834 podStartE2EDuration="34.892306834s" podCreationTimestamp="2025-12-05 23:34:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:35:20.836053232 +0000 UTC m=+941.519457508" watchObservedRunningTime="2025-12-05 23:35:20.892306834 +0000 UTC m=+941.575711110" Dec 05 23:35:20 crc kubenswrapper[4734]: I1205 23:35:20.913854 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-w8l4b" podStartSLOduration=3.899384382 podStartE2EDuration="34.913817375s" podCreationTimestamp="2025-12-05 23:34:46 +0000 UTC" firstStartedPulling="2025-12-05 23:34:48.292509628 +0000 UTC m=+908.975913904" lastFinishedPulling="2025-12-05 23:35:19.306942621 +0000 UTC m=+939.990346897" 
observedRunningTime="2025-12-05 23:35:20.889584648 +0000 UTC m=+941.572988934" watchObservedRunningTime="2025-12-05 23:35:20.913817375 +0000 UTC m=+941.597221651" Dec 05 23:35:20 crc kubenswrapper[4734]: I1205 23:35:20.930767 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-4clqb" podStartSLOduration=4.165102615 podStartE2EDuration="35.930732255s" podCreationTimestamp="2025-12-05 23:34:45 +0000 UTC" firstStartedPulling="2025-12-05 23:34:47.415926956 +0000 UTC m=+908.099331222" lastFinishedPulling="2025-12-05 23:35:19.181556586 +0000 UTC m=+939.864960862" observedRunningTime="2025-12-05 23:35:20.911941439 +0000 UTC m=+941.595345715" watchObservedRunningTime="2025-12-05 23:35:20.930732255 +0000 UTC m=+941.614136531" Dec 05 23:35:21 crc kubenswrapper[4734]: I1205 23:35:21.762109 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-rw8vg" event={"ID":"ef794353-3292-4809-94d8-105aaa36889e","Type":"ContainerStarted","Data":"d87092da3d5631577dc2325a35adb3f09e52b26120fb7c6f459ee02348da78cb"} Dec 05 23:35:21 crc kubenswrapper[4734]: I1205 23:35:21.764112 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-967d97867-rw8vg" Dec 05 23:35:21 crc kubenswrapper[4734]: I1205 23:35:21.784447 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-t5gsd" event={"ID":"3255ef71-c5a8-4fef-a1ab-dc2107c710eb","Type":"ContainerStarted","Data":"cfd9a41844555649d08933c067dc64c12ef7f53df48fb71d9a49a894c86dbf9d"} Dec 05 23:35:21 crc kubenswrapper[4734]: I1205 23:35:21.785549 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-t5gsd" Dec 05 23:35:21 crc kubenswrapper[4734]: I1205 23:35:21.791707 
4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-967d97867-rw8vg" podStartSLOduration=4.056131987 podStartE2EDuration="36.79168802s" podCreationTimestamp="2025-12-05 23:34:45 +0000 UTC" firstStartedPulling="2025-12-05 23:34:47.647881901 +0000 UTC m=+908.331286177" lastFinishedPulling="2025-12-05 23:35:20.383437934 +0000 UTC m=+941.066842210" observedRunningTime="2025-12-05 23:35:21.788030431 +0000 UTC m=+942.471434717" watchObservedRunningTime="2025-12-05 23:35:21.79168802 +0000 UTC m=+942.475092296" Dec 05 23:35:21 crc kubenswrapper[4734]: I1205 23:35:21.814065 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-xwqfj" event={"ID":"b7ee6df9-99e2-480d-aa84-7618ff0cda2f","Type":"ContainerStarted","Data":"79092feb7cbd5a92fe75641c5871336e2b278fbf92c085d4e23b3071b6137793"} Dec 05 23:35:21 crc kubenswrapper[4734]: I1205 23:35:21.815138 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-xwqfj" Dec 05 23:35:21 crc kubenswrapper[4734]: I1205 23:35:21.835237 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-nc6wd" event={"ID":"608bca6a-1cb5-44b9-91c6-32a77372a4e5","Type":"ContainerStarted","Data":"bf1656a5d0ebcc89d23b99f3182cdea6e497a61a0b1b6e3fbcba07ecbe378e25"} Dec 05 23:35:21 crc kubenswrapper[4734]: I1205 23:35:21.835633 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-nc6wd" Dec 05 23:35:21 crc kubenswrapper[4734]: I1205 23:35:21.869867 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-t5gsd" podStartSLOduration=4.142719054 podStartE2EDuration="36.869837272s" 
podCreationTimestamp="2025-12-05 23:34:45 +0000 UTC" firstStartedPulling="2025-12-05 23:34:47.652710068 +0000 UTC m=+908.336114344" lastFinishedPulling="2025-12-05 23:35:20.379828286 +0000 UTC m=+941.063232562" observedRunningTime="2025-12-05 23:35:21.821792178 +0000 UTC m=+942.505196464" watchObservedRunningTime="2025-12-05 23:35:21.869837272 +0000 UTC m=+942.553241548" Dec 05 23:35:21 crc kubenswrapper[4734]: I1205 23:35:21.910693 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-xwqfj" podStartSLOduration=3.819679313 podStartE2EDuration="35.91066906s" podCreationTimestamp="2025-12-05 23:34:46 +0000 UTC" firstStartedPulling="2025-12-05 23:34:48.275932437 +0000 UTC m=+908.959336713" lastFinishedPulling="2025-12-05 23:35:20.366922184 +0000 UTC m=+941.050326460" observedRunningTime="2025-12-05 23:35:21.872172428 +0000 UTC m=+942.555576704" watchObservedRunningTime="2025-12-05 23:35:21.91066906 +0000 UTC m=+942.594073336" Dec 05 23:35:25 crc kubenswrapper[4734]: I1205 23:35:25.872613 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-r2427" event={"ID":"df5aaec7-4487-47a1-98c4-0206d0ecf7f4","Type":"ContainerStarted","Data":"8dcbc9a0de448539f1119cd49eb31f0386060a90f4c711ad4fb0e7b82871fd51"} Dec 05 23:35:25 crc kubenswrapper[4734]: I1205 23:35:25.873302 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-r2427" event={"ID":"df5aaec7-4487-47a1-98c4-0206d0ecf7f4","Type":"ContainerStarted","Data":"665f1090ef97e5e9f2ed40f83f3d0d2fa001e44c241fa308bbd38a7cb771a9e7"} Dec 05 23:35:25 crc kubenswrapper[4734]: I1205 23:35:25.873371 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-r2427" Dec 05 23:35:25 crc kubenswrapper[4734]: I1205 23:35:25.879357 4734 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fhnqmj" event={"ID":"9cce8abe-4425-4cea-ac4f-3fd707bd5737","Type":"ContainerStarted","Data":"3f575aa5521eb45f7e5d3693b957c10aaa755a8d759e6927ebaca511a635e289"} Dec 05 23:35:25 crc kubenswrapper[4734]: I1205 23:35:25.879390 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fhnqmj" event={"ID":"9cce8abe-4425-4cea-ac4f-3fd707bd5737","Type":"ContainerStarted","Data":"7058d551020de13fdaa817b1520d316c9c49762f37c9e8ceff84d65090bd9a13"} Dec 05 23:35:25 crc kubenswrapper[4734]: I1205 23:35:25.880238 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fhnqmj" Dec 05 23:35:25 crc kubenswrapper[4734]: I1205 23:35:25.913020 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-nc6wd" podStartSLOduration=8.387213576 podStartE2EDuration="40.913002791s" podCreationTimestamp="2025-12-05 23:34:45 +0000 UTC" firstStartedPulling="2025-12-05 23:34:47.838994508 +0000 UTC m=+908.522398784" lastFinishedPulling="2025-12-05 23:35:20.364783723 +0000 UTC m=+941.048187999" observedRunningTime="2025-12-05 23:35:21.916788738 +0000 UTC m=+942.600193014" watchObservedRunningTime="2025-12-05 23:35:25.913002791 +0000 UTC m=+946.596407067" Dec 05 23:35:25 crc kubenswrapper[4734]: I1205 23:35:25.997712 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-r2427" podStartSLOduration=30.001685007 podStartE2EDuration="40.997691732s" podCreationTimestamp="2025-12-05 23:34:45 +0000 UTC" firstStartedPulling="2025-12-05 23:35:14.356063085 +0000 UTC m=+935.039467361" lastFinishedPulling="2025-12-05 23:35:25.35206981 
+0000 UTC m=+946.035474086" observedRunningTime="2025-12-05 23:35:25.917689195 +0000 UTC m=+946.601093461" watchObservedRunningTime="2025-12-05 23:35:25.997691732 +0000 UTC m=+946.681096008" Dec 05 23:35:25 crc kubenswrapper[4734]: I1205 23:35:25.999407 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fhnqmj" podStartSLOduration=33.682618566 podStartE2EDuration="39.999402613s" podCreationTimestamp="2025-12-05 23:34:46 +0000 UTC" firstStartedPulling="2025-12-05 23:35:19.038267926 +0000 UTC m=+939.721672202" lastFinishedPulling="2025-12-05 23:35:25.355051973 +0000 UTC m=+946.038456249" observedRunningTime="2025-12-05 23:35:25.993255354 +0000 UTC m=+946.676659620" watchObservedRunningTime="2025-12-05 23:35:25.999402613 +0000 UTC m=+946.682806889" Dec 05 23:35:26 crc kubenswrapper[4734]: I1205 23:35:26.146721 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-4clqb" Dec 05 23:35:26 crc kubenswrapper[4734]: I1205 23:35:26.207246 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-rp9j7" Dec 05 23:35:26 crc kubenswrapper[4734]: I1205 23:35:26.373063 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-967d97867-rw8vg" Dec 05 23:35:26 crc kubenswrapper[4734]: I1205 23:35:26.416247 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-t5gsd" Dec 05 23:35:26 crc kubenswrapper[4734]: I1205 23:35:26.483328 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-fwbcd" Dec 05 23:35:26 crc kubenswrapper[4734]: I1205 
23:35:26.556812 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-nc6wd" Dec 05 23:35:26 crc kubenswrapper[4734]: I1205 23:35:26.560344 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zx66z" Dec 05 23:35:26 crc kubenswrapper[4734]: I1205 23:35:26.609805 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-wf4vr" Dec 05 23:35:26 crc kubenswrapper[4734]: I1205 23:35:26.698009 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-bf28l" Dec 05 23:35:26 crc kubenswrapper[4734]: I1205 23:35:26.752789 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-hdt6h" Dec 05 23:35:26 crc kubenswrapper[4734]: I1205 23:35:26.769019 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-w8l4b" Dec 05 23:35:26 crc kubenswrapper[4734]: I1205 23:35:26.811143 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-cdtjx" Dec 05 23:35:26 crc kubenswrapper[4734]: I1205 23:35:26.841968 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2fm4z" Dec 05 23:35:26 crc kubenswrapper[4734]: I1205 23:35:26.975898 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-xwqfj" Dec 05 23:35:27 crc kubenswrapper[4734]: I1205 23:35:27.160756 4734 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-j4m2j" Dec 05 23:35:28 crc kubenswrapper[4734]: I1205 23:35:28.143496 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g8vzj"] Dec 05 23:35:28 crc kubenswrapper[4734]: I1205 23:35:28.145593 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g8vzj" Dec 05 23:35:28 crc kubenswrapper[4734]: I1205 23:35:28.170390 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11362b5d-a198-419f-a1d7-d55e27f8fbc7-catalog-content\") pod \"certified-operators-g8vzj\" (UID: \"11362b5d-a198-419f-a1d7-d55e27f8fbc7\") " pod="openshift-marketplace/certified-operators-g8vzj" Dec 05 23:35:28 crc kubenswrapper[4734]: I1205 23:35:28.170443 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct427\" (UniqueName: \"kubernetes.io/projected/11362b5d-a198-419f-a1d7-d55e27f8fbc7-kube-api-access-ct427\") pod \"certified-operators-g8vzj\" (UID: \"11362b5d-a198-419f-a1d7-d55e27f8fbc7\") " pod="openshift-marketplace/certified-operators-g8vzj" Dec 05 23:35:28 crc kubenswrapper[4734]: I1205 23:35:28.170518 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11362b5d-a198-419f-a1d7-d55e27f8fbc7-utilities\") pod \"certified-operators-g8vzj\" (UID: \"11362b5d-a198-419f-a1d7-d55e27f8fbc7\") " pod="openshift-marketplace/certified-operators-g8vzj" Dec 05 23:35:28 crc kubenswrapper[4734]: I1205 23:35:28.174573 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g8vzj"] Dec 05 23:35:28 crc kubenswrapper[4734]: I1205 23:35:28.272180 4734 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11362b5d-a198-419f-a1d7-d55e27f8fbc7-utilities\") pod \"certified-operators-g8vzj\" (UID: \"11362b5d-a198-419f-a1d7-d55e27f8fbc7\") " pod="openshift-marketplace/certified-operators-g8vzj" Dec 05 23:35:28 crc kubenswrapper[4734]: I1205 23:35:28.272322 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11362b5d-a198-419f-a1d7-d55e27f8fbc7-catalog-content\") pod \"certified-operators-g8vzj\" (UID: \"11362b5d-a198-419f-a1d7-d55e27f8fbc7\") " pod="openshift-marketplace/certified-operators-g8vzj" Dec 05 23:35:28 crc kubenswrapper[4734]: I1205 23:35:28.272992 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct427\" (UniqueName: \"kubernetes.io/projected/11362b5d-a198-419f-a1d7-d55e27f8fbc7-kube-api-access-ct427\") pod \"certified-operators-g8vzj\" (UID: \"11362b5d-a198-419f-a1d7-d55e27f8fbc7\") " pod="openshift-marketplace/certified-operators-g8vzj" Dec 05 23:35:28 crc kubenswrapper[4734]: I1205 23:35:28.272835 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11362b5d-a198-419f-a1d7-d55e27f8fbc7-utilities\") pod \"certified-operators-g8vzj\" (UID: \"11362b5d-a198-419f-a1d7-d55e27f8fbc7\") " pod="openshift-marketplace/certified-operators-g8vzj" Dec 05 23:35:28 crc kubenswrapper[4734]: I1205 23:35:28.272940 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11362b5d-a198-419f-a1d7-d55e27f8fbc7-catalog-content\") pod \"certified-operators-g8vzj\" (UID: \"11362b5d-a198-419f-a1d7-d55e27f8fbc7\") " pod="openshift-marketplace/certified-operators-g8vzj" Dec 05 23:35:28 crc kubenswrapper[4734]: I1205 23:35:28.309100 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ct427\" (UniqueName: \"kubernetes.io/projected/11362b5d-a198-419f-a1d7-d55e27f8fbc7-kube-api-access-ct427\") pod \"certified-operators-g8vzj\" (UID: \"11362b5d-a198-419f-a1d7-d55e27f8fbc7\") " pod="openshift-marketplace/certified-operators-g8vzj" Dec 05 23:35:28 crc kubenswrapper[4734]: I1205 23:35:28.473074 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g8vzj" Dec 05 23:35:28 crc kubenswrapper[4734]: I1205 23:35:28.625731 4734 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 23:35:28 crc kubenswrapper[4734]: I1205 23:35:28.772868 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g8vzj"] Dec 05 23:35:28 crc kubenswrapper[4734]: W1205 23:35:28.792718 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11362b5d_a198_419f_a1d7_d55e27f8fbc7.slice/crio-54690f3e4398553fa7d49b867f7487c4303a51ba6b0101e09991c3c90ed7f94f WatchSource:0}: Error finding container 54690f3e4398553fa7d49b867f7487c4303a51ba6b0101e09991c3c90ed7f94f: Status 404 returned error can't find the container with id 54690f3e4398553fa7d49b867f7487c4303a51ba6b0101e09991c3c90ed7f94f Dec 05 23:35:28 crc kubenswrapper[4734]: I1205 23:35:28.903342 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8vzj" event={"ID":"11362b5d-a198-419f-a1d7-d55e27f8fbc7","Type":"ContainerStarted","Data":"54690f3e4398553fa7d49b867f7487c4303a51ba6b0101e09991c3c90ed7f94f"} Dec 05 23:35:28 crc kubenswrapper[4734]: I1205 23:35:28.974408 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5845f76896-vhzwq" Dec 05 23:35:32 crc kubenswrapper[4734]: I1205 23:35:32.228890 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-r2427" Dec 05 23:35:36 crc kubenswrapper[4734]: I1205 23:35:36.979774 4734 generic.go:334] "Generic (PLEG): container finished" podID="11362b5d-a198-419f-a1d7-d55e27f8fbc7" containerID="c6e59ca29fadb815680bfd189133226ae42722e648a03591af9388022df51aeb" exitCode=0 Dec 05 23:35:36 crc kubenswrapper[4734]: I1205 23:35:36.979965 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8vzj" event={"ID":"11362b5d-a198-419f-a1d7-d55e27f8fbc7","Type":"ContainerDied","Data":"c6e59ca29fadb815680bfd189133226ae42722e648a03591af9388022df51aeb"} Dec 05 23:35:38 crc kubenswrapper[4734]: I1205 23:35:38.517179 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fhnqmj" Dec 05 23:35:42 crc kubenswrapper[4734]: I1205 23:35:42.036506 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4bt9z" event={"ID":"43c8ec4c-96f9-47f0-9313-2813ea1c62c2","Type":"ContainerStarted","Data":"555b256d16762b8030417a8064d2a3b465c61eb0e821ae9304e25b3241a65e5d"} Dec 05 23:35:42 crc kubenswrapper[4734]: I1205 23:35:42.040048 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8vzj" event={"ID":"11362b5d-a198-419f-a1d7-d55e27f8fbc7","Type":"ContainerStarted","Data":"82bd9b6b3747403687b4cc9047aa849523c11c25277a565d61bd112ebbb665e3"} Dec 05 23:35:42 crc kubenswrapper[4734]: I1205 23:35:42.061138 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4bt9z" podStartSLOduration=2.645719319 podStartE2EDuration="56.061109104s" podCreationTimestamp="2025-12-05 23:34:46 +0000 UTC" firstStartedPulling="2025-12-05 23:34:48.282751422 +0000 UTC m=+908.966155698" 
lastFinishedPulling="2025-12-05 23:35:41.698141207 +0000 UTC m=+962.381545483" observedRunningTime="2025-12-05 23:35:42.056441742 +0000 UTC m=+962.739846018" watchObservedRunningTime="2025-12-05 23:35:42.061109104 +0000 UTC m=+962.744513380" Dec 05 23:35:43 crc kubenswrapper[4734]: I1205 23:35:43.050147 4734 generic.go:334] "Generic (PLEG): container finished" podID="11362b5d-a198-419f-a1d7-d55e27f8fbc7" containerID="82bd9b6b3747403687b4cc9047aa849523c11c25277a565d61bd112ebbb665e3" exitCode=0 Dec 05 23:35:43 crc kubenswrapper[4734]: I1205 23:35:43.050218 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8vzj" event={"ID":"11362b5d-a198-419f-a1d7-d55e27f8fbc7","Type":"ContainerDied","Data":"82bd9b6b3747403687b4cc9047aa849523c11c25277a565d61bd112ebbb665e3"} Dec 05 23:35:44 crc kubenswrapper[4734]: I1205 23:35:44.073562 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8vzj" event={"ID":"11362b5d-a198-419f-a1d7-d55e27f8fbc7","Type":"ContainerStarted","Data":"75fb1c33bf301f2c21fdb6ab521f4363e8975ac766dabcecd5895e9da9959681"} Dec 05 23:35:44 crc kubenswrapper[4734]: I1205 23:35:44.104465 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g8vzj" podStartSLOduration=10.605331737 podStartE2EDuration="16.104438777s" podCreationTimestamp="2025-12-05 23:35:28 +0000 UTC" firstStartedPulling="2025-12-05 23:35:37.991444023 +0000 UTC m=+958.674848299" lastFinishedPulling="2025-12-05 23:35:43.490551063 +0000 UTC m=+964.173955339" observedRunningTime="2025-12-05 23:35:44.10046187 +0000 UTC m=+964.783866216" watchObservedRunningTime="2025-12-05 23:35:44.104438777 +0000 UTC m=+964.787843053" Dec 05 23:35:48 crc kubenswrapper[4734]: I1205 23:35:48.473586 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g8vzj" Dec 05 23:35:48 crc 
kubenswrapper[4734]: I1205 23:35:48.474594 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g8vzj" Dec 05 23:35:48 crc kubenswrapper[4734]: I1205 23:35:48.534717 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g8vzj" Dec 05 23:35:49 crc kubenswrapper[4734]: I1205 23:35:49.169669 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g8vzj" Dec 05 23:35:49 crc kubenswrapper[4734]: I1205 23:35:49.250623 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g8vzj"] Dec 05 23:35:50 crc kubenswrapper[4734]: I1205 23:35:50.445095 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:35:50 crc kubenswrapper[4734]: I1205 23:35:50.445182 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:35:51 crc kubenswrapper[4734]: I1205 23:35:51.133117 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-g8vzj" podUID="11362b5d-a198-419f-a1d7-d55e27f8fbc7" containerName="registry-server" containerID="cri-o://75fb1c33bf301f2c21fdb6ab521f4363e8975ac766dabcecd5895e9da9959681" gracePeriod=2 Dec 05 23:35:52 crc kubenswrapper[4734]: I1205 23:35:52.098722 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g8vzj" Dec 05 23:35:52 crc kubenswrapper[4734]: I1205 23:35:52.149057 4734 generic.go:334] "Generic (PLEG): container finished" podID="11362b5d-a198-419f-a1d7-d55e27f8fbc7" containerID="75fb1c33bf301f2c21fdb6ab521f4363e8975ac766dabcecd5895e9da9959681" exitCode=0 Dec 05 23:35:52 crc kubenswrapper[4734]: I1205 23:35:52.149133 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8vzj" event={"ID":"11362b5d-a198-419f-a1d7-d55e27f8fbc7","Type":"ContainerDied","Data":"75fb1c33bf301f2c21fdb6ab521f4363e8975ac766dabcecd5895e9da9959681"} Dec 05 23:35:52 crc kubenswrapper[4734]: I1205 23:35:52.149170 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8vzj" event={"ID":"11362b5d-a198-419f-a1d7-d55e27f8fbc7","Type":"ContainerDied","Data":"54690f3e4398553fa7d49b867f7487c4303a51ba6b0101e09991c3c90ed7f94f"} Dec 05 23:35:52 crc kubenswrapper[4734]: I1205 23:35:52.149209 4734 scope.go:117] "RemoveContainer" containerID="75fb1c33bf301f2c21fdb6ab521f4363e8975ac766dabcecd5895e9da9959681" Dec 05 23:35:52 crc kubenswrapper[4734]: I1205 23:35:52.149465 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g8vzj" Dec 05 23:35:52 crc kubenswrapper[4734]: I1205 23:35:52.171409 4734 scope.go:117] "RemoveContainer" containerID="82bd9b6b3747403687b4cc9047aa849523c11c25277a565d61bd112ebbb665e3" Dec 05 23:35:52 crc kubenswrapper[4734]: I1205 23:35:52.198431 4734 scope.go:117] "RemoveContainer" containerID="c6e59ca29fadb815680bfd189133226ae42722e648a03591af9388022df51aeb" Dec 05 23:35:52 crc kubenswrapper[4734]: I1205 23:35:52.227608 4734 scope.go:117] "RemoveContainer" containerID="75fb1c33bf301f2c21fdb6ab521f4363e8975ac766dabcecd5895e9da9959681" Dec 05 23:35:52 crc kubenswrapper[4734]: E1205 23:35:52.230768 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75fb1c33bf301f2c21fdb6ab521f4363e8975ac766dabcecd5895e9da9959681\": container with ID starting with 75fb1c33bf301f2c21fdb6ab521f4363e8975ac766dabcecd5895e9da9959681 not found: ID does not exist" containerID="75fb1c33bf301f2c21fdb6ab521f4363e8975ac766dabcecd5895e9da9959681" Dec 05 23:35:52 crc kubenswrapper[4734]: I1205 23:35:52.230834 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75fb1c33bf301f2c21fdb6ab521f4363e8975ac766dabcecd5895e9da9959681"} err="failed to get container status \"75fb1c33bf301f2c21fdb6ab521f4363e8975ac766dabcecd5895e9da9959681\": rpc error: code = NotFound desc = could not find container \"75fb1c33bf301f2c21fdb6ab521f4363e8975ac766dabcecd5895e9da9959681\": container with ID starting with 75fb1c33bf301f2c21fdb6ab521f4363e8975ac766dabcecd5895e9da9959681 not found: ID does not exist" Dec 05 23:35:52 crc kubenswrapper[4734]: I1205 23:35:52.230869 4734 scope.go:117] "RemoveContainer" containerID="82bd9b6b3747403687b4cc9047aa849523c11c25277a565d61bd112ebbb665e3" Dec 05 23:35:52 crc kubenswrapper[4734]: E1205 23:35:52.231553 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"82bd9b6b3747403687b4cc9047aa849523c11c25277a565d61bd112ebbb665e3\": container with ID starting with 82bd9b6b3747403687b4cc9047aa849523c11c25277a565d61bd112ebbb665e3 not found: ID does not exist" containerID="82bd9b6b3747403687b4cc9047aa849523c11c25277a565d61bd112ebbb665e3" Dec 05 23:35:52 crc kubenswrapper[4734]: I1205 23:35:52.231610 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82bd9b6b3747403687b4cc9047aa849523c11c25277a565d61bd112ebbb665e3"} err="failed to get container status \"82bd9b6b3747403687b4cc9047aa849523c11c25277a565d61bd112ebbb665e3\": rpc error: code = NotFound desc = could not find container \"82bd9b6b3747403687b4cc9047aa849523c11c25277a565d61bd112ebbb665e3\": container with ID starting with 82bd9b6b3747403687b4cc9047aa849523c11c25277a565d61bd112ebbb665e3 not found: ID does not exist" Dec 05 23:35:52 crc kubenswrapper[4734]: I1205 23:35:52.231648 4734 scope.go:117] "RemoveContainer" containerID="c6e59ca29fadb815680bfd189133226ae42722e648a03591af9388022df51aeb" Dec 05 23:35:52 crc kubenswrapper[4734]: E1205 23:35:52.232073 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6e59ca29fadb815680bfd189133226ae42722e648a03591af9388022df51aeb\": container with ID starting with c6e59ca29fadb815680bfd189133226ae42722e648a03591af9388022df51aeb not found: ID does not exist" containerID="c6e59ca29fadb815680bfd189133226ae42722e648a03591af9388022df51aeb" Dec 05 23:35:52 crc kubenswrapper[4734]: I1205 23:35:52.232125 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6e59ca29fadb815680bfd189133226ae42722e648a03591af9388022df51aeb"} err="failed to get container status \"c6e59ca29fadb815680bfd189133226ae42722e648a03591af9388022df51aeb\": rpc error: code = NotFound desc = could not find container 
\"c6e59ca29fadb815680bfd189133226ae42722e648a03591af9388022df51aeb\": container with ID starting with c6e59ca29fadb815680bfd189133226ae42722e648a03591af9388022df51aeb not found: ID does not exist" Dec 05 23:35:52 crc kubenswrapper[4734]: I1205 23:35:52.274875 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11362b5d-a198-419f-a1d7-d55e27f8fbc7-utilities\") pod \"11362b5d-a198-419f-a1d7-d55e27f8fbc7\" (UID: \"11362b5d-a198-419f-a1d7-d55e27f8fbc7\") " Dec 05 23:35:52 crc kubenswrapper[4734]: I1205 23:35:52.274970 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ct427\" (UniqueName: \"kubernetes.io/projected/11362b5d-a198-419f-a1d7-d55e27f8fbc7-kube-api-access-ct427\") pod \"11362b5d-a198-419f-a1d7-d55e27f8fbc7\" (UID: \"11362b5d-a198-419f-a1d7-d55e27f8fbc7\") " Dec 05 23:35:52 crc kubenswrapper[4734]: I1205 23:35:52.275102 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11362b5d-a198-419f-a1d7-d55e27f8fbc7-catalog-content\") pod \"11362b5d-a198-419f-a1d7-d55e27f8fbc7\" (UID: \"11362b5d-a198-419f-a1d7-d55e27f8fbc7\") " Dec 05 23:35:52 crc kubenswrapper[4734]: I1205 23:35:52.276214 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11362b5d-a198-419f-a1d7-d55e27f8fbc7-utilities" (OuterVolumeSpecName: "utilities") pod "11362b5d-a198-419f-a1d7-d55e27f8fbc7" (UID: "11362b5d-a198-419f-a1d7-d55e27f8fbc7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:35:52 crc kubenswrapper[4734]: I1205 23:35:52.284446 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11362b5d-a198-419f-a1d7-d55e27f8fbc7-kube-api-access-ct427" (OuterVolumeSpecName: "kube-api-access-ct427") pod "11362b5d-a198-419f-a1d7-d55e27f8fbc7" (UID: "11362b5d-a198-419f-a1d7-d55e27f8fbc7"). InnerVolumeSpecName "kube-api-access-ct427". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:35:52 crc kubenswrapper[4734]: I1205 23:35:52.377502 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11362b5d-a198-419f-a1d7-d55e27f8fbc7-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:35:52 crc kubenswrapper[4734]: I1205 23:35:52.377584 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ct427\" (UniqueName: \"kubernetes.io/projected/11362b5d-a198-419f-a1d7-d55e27f8fbc7-kube-api-access-ct427\") on node \"crc\" DevicePath \"\"" Dec 05 23:35:52 crc kubenswrapper[4734]: I1205 23:35:52.652490 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11362b5d-a198-419f-a1d7-d55e27f8fbc7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "11362b5d-a198-419f-a1d7-d55e27f8fbc7" (UID: "11362b5d-a198-419f-a1d7-d55e27f8fbc7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:35:52 crc kubenswrapper[4734]: I1205 23:35:52.686263 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11362b5d-a198-419f-a1d7-d55e27f8fbc7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:35:52 crc kubenswrapper[4734]: I1205 23:35:52.783418 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g8vzj"] Dec 05 23:35:52 crc kubenswrapper[4734]: I1205 23:35:52.800032 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-g8vzj"] Dec 05 23:35:53 crc kubenswrapper[4734]: I1205 23:35:53.633928 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11362b5d-a198-419f-a1d7-d55e27f8fbc7" path="/var/lib/kubelet/pods/11362b5d-a198-419f-a1d7-d55e27f8fbc7/volumes" Dec 05 23:36:06 crc kubenswrapper[4734]: I1205 23:36:06.148560 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fvnv4"] Dec 05 23:36:06 crc kubenswrapper[4734]: E1205 23:36:06.149825 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11362b5d-a198-419f-a1d7-d55e27f8fbc7" containerName="extract-utilities" Dec 05 23:36:06 crc kubenswrapper[4734]: I1205 23:36:06.149846 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="11362b5d-a198-419f-a1d7-d55e27f8fbc7" containerName="extract-utilities" Dec 05 23:36:06 crc kubenswrapper[4734]: E1205 23:36:06.149871 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11362b5d-a198-419f-a1d7-d55e27f8fbc7" containerName="registry-server" Dec 05 23:36:06 crc kubenswrapper[4734]: I1205 23:36:06.149880 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="11362b5d-a198-419f-a1d7-d55e27f8fbc7" containerName="registry-server" Dec 05 23:36:06 crc kubenswrapper[4734]: E1205 23:36:06.149908 4734 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="11362b5d-a198-419f-a1d7-d55e27f8fbc7" containerName="extract-content" Dec 05 23:36:06 crc kubenswrapper[4734]: I1205 23:36:06.149915 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="11362b5d-a198-419f-a1d7-d55e27f8fbc7" containerName="extract-content" Dec 05 23:36:06 crc kubenswrapper[4734]: I1205 23:36:06.151755 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="11362b5d-a198-419f-a1d7-d55e27f8fbc7" containerName="registry-server" Dec 05 23:36:06 crc kubenswrapper[4734]: I1205 23:36:06.153071 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fvnv4" Dec 05 23:36:06 crc kubenswrapper[4734]: I1205 23:36:06.155310 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-ck94g" Dec 05 23:36:06 crc kubenswrapper[4734]: I1205 23:36:06.156307 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 05 23:36:06 crc kubenswrapper[4734]: I1205 23:36:06.162930 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 05 23:36:06 crc kubenswrapper[4734]: I1205 23:36:06.162991 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 05 23:36:06 crc kubenswrapper[4734]: I1205 23:36:06.182149 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fvnv4"] Dec 05 23:36:06 crc kubenswrapper[4734]: I1205 23:36:06.209187 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zpj4b"] Dec 05 23:36:06 crc kubenswrapper[4734]: I1205 23:36:06.211938 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-zpj4b" Dec 05 23:36:06 crc kubenswrapper[4734]: I1205 23:36:06.216766 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 05 23:36:06 crc kubenswrapper[4734]: I1205 23:36:06.225126 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zpj4b"] Dec 05 23:36:06 crc kubenswrapper[4734]: I1205 23:36:06.313243 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a30f19aa-415d-4608-ac2a-7d52751225d7-config\") pod \"dnsmasq-dns-675f4bcbfc-fvnv4\" (UID: \"a30f19aa-415d-4608-ac2a-7d52751225d7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fvnv4" Dec 05 23:36:06 crc kubenswrapper[4734]: I1205 23:36:06.313811 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96d53662-9e9d-4205-9a4b-23eea707f724-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-zpj4b\" (UID: \"96d53662-9e9d-4205-9a4b-23eea707f724\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zpj4b" Dec 05 23:36:06 crc kubenswrapper[4734]: I1205 23:36:06.313918 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbz4p\" (UniqueName: \"kubernetes.io/projected/96d53662-9e9d-4205-9a4b-23eea707f724-kube-api-access-lbz4p\") pod \"dnsmasq-dns-78dd6ddcc-zpj4b\" (UID: \"96d53662-9e9d-4205-9a4b-23eea707f724\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zpj4b" Dec 05 23:36:06 crc kubenswrapper[4734]: I1205 23:36:06.313992 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xddp\" (UniqueName: \"kubernetes.io/projected/a30f19aa-415d-4608-ac2a-7d52751225d7-kube-api-access-2xddp\") pod \"dnsmasq-dns-675f4bcbfc-fvnv4\" (UID: \"a30f19aa-415d-4608-ac2a-7d52751225d7\") " 
pod="openstack/dnsmasq-dns-675f4bcbfc-fvnv4" Dec 05 23:36:06 crc kubenswrapper[4734]: I1205 23:36:06.314086 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96d53662-9e9d-4205-9a4b-23eea707f724-config\") pod \"dnsmasq-dns-78dd6ddcc-zpj4b\" (UID: \"96d53662-9e9d-4205-9a4b-23eea707f724\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zpj4b" Dec 05 23:36:06 crc kubenswrapper[4734]: I1205 23:36:06.415930 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbz4p\" (UniqueName: \"kubernetes.io/projected/96d53662-9e9d-4205-9a4b-23eea707f724-kube-api-access-lbz4p\") pod \"dnsmasq-dns-78dd6ddcc-zpj4b\" (UID: \"96d53662-9e9d-4205-9a4b-23eea707f724\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zpj4b" Dec 05 23:36:06 crc kubenswrapper[4734]: I1205 23:36:06.415987 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xddp\" (UniqueName: \"kubernetes.io/projected/a30f19aa-415d-4608-ac2a-7d52751225d7-kube-api-access-2xddp\") pod \"dnsmasq-dns-675f4bcbfc-fvnv4\" (UID: \"a30f19aa-415d-4608-ac2a-7d52751225d7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fvnv4" Dec 05 23:36:06 crc kubenswrapper[4734]: I1205 23:36:06.416044 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96d53662-9e9d-4205-9a4b-23eea707f724-config\") pod \"dnsmasq-dns-78dd6ddcc-zpj4b\" (UID: \"96d53662-9e9d-4205-9a4b-23eea707f724\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zpj4b" Dec 05 23:36:06 crc kubenswrapper[4734]: I1205 23:36:06.416079 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a30f19aa-415d-4608-ac2a-7d52751225d7-config\") pod \"dnsmasq-dns-675f4bcbfc-fvnv4\" (UID: \"a30f19aa-415d-4608-ac2a-7d52751225d7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fvnv4" Dec 05 23:36:06 
crc kubenswrapper[4734]: I1205 23:36:06.416116 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96d53662-9e9d-4205-9a4b-23eea707f724-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-zpj4b\" (UID: \"96d53662-9e9d-4205-9a4b-23eea707f724\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zpj4b" Dec 05 23:36:06 crc kubenswrapper[4734]: I1205 23:36:06.417203 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a30f19aa-415d-4608-ac2a-7d52751225d7-config\") pod \"dnsmasq-dns-675f4bcbfc-fvnv4\" (UID: \"a30f19aa-415d-4608-ac2a-7d52751225d7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fvnv4" Dec 05 23:36:06 crc kubenswrapper[4734]: I1205 23:36:06.417203 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96d53662-9e9d-4205-9a4b-23eea707f724-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-zpj4b\" (UID: \"96d53662-9e9d-4205-9a4b-23eea707f724\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zpj4b" Dec 05 23:36:06 crc kubenswrapper[4734]: I1205 23:36:06.417306 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96d53662-9e9d-4205-9a4b-23eea707f724-config\") pod \"dnsmasq-dns-78dd6ddcc-zpj4b\" (UID: \"96d53662-9e9d-4205-9a4b-23eea707f724\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zpj4b" Dec 05 23:36:06 crc kubenswrapper[4734]: I1205 23:36:06.436578 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbz4p\" (UniqueName: \"kubernetes.io/projected/96d53662-9e9d-4205-9a4b-23eea707f724-kube-api-access-lbz4p\") pod \"dnsmasq-dns-78dd6ddcc-zpj4b\" (UID: \"96d53662-9e9d-4205-9a4b-23eea707f724\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zpj4b" Dec 05 23:36:06 crc kubenswrapper[4734]: I1205 23:36:06.436893 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-2xddp\" (UniqueName: \"kubernetes.io/projected/a30f19aa-415d-4608-ac2a-7d52751225d7-kube-api-access-2xddp\") pod \"dnsmasq-dns-675f4bcbfc-fvnv4\" (UID: \"a30f19aa-415d-4608-ac2a-7d52751225d7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fvnv4" Dec 05 23:36:06 crc kubenswrapper[4734]: I1205 23:36:06.480560 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fvnv4" Dec 05 23:36:06 crc kubenswrapper[4734]: I1205 23:36:06.556071 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-zpj4b" Dec 05 23:36:06 crc kubenswrapper[4734]: I1205 23:36:06.941960 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fvnv4"] Dec 05 23:36:06 crc kubenswrapper[4734]: W1205 23:36:06.953214 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda30f19aa_415d_4608_ac2a_7d52751225d7.slice/crio-b89b92adec2f93d31ca261d54c0fd9e431dca8b8e749b4371bb076223e2d87ce WatchSource:0}: Error finding container b89b92adec2f93d31ca261d54c0fd9e431dca8b8e749b4371bb076223e2d87ce: Status 404 returned error can't find the container with id b89b92adec2f93d31ca261d54c0fd9e431dca8b8e749b4371bb076223e2d87ce Dec 05 23:36:07 crc kubenswrapper[4734]: I1205 23:36:07.048665 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zpj4b"] Dec 05 23:36:07 crc kubenswrapper[4734]: I1205 23:36:07.274858 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-zpj4b" event={"ID":"96d53662-9e9d-4205-9a4b-23eea707f724","Type":"ContainerStarted","Data":"4cfa82277d99d4ded5415167d1ca5acad1682b9d6a1ae46bb6f755b939e35c30"} Dec 05 23:36:07 crc kubenswrapper[4734]: I1205 23:36:07.276367 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-fvnv4" 
event={"ID":"a30f19aa-415d-4608-ac2a-7d52751225d7","Type":"ContainerStarted","Data":"b89b92adec2f93d31ca261d54c0fd9e431dca8b8e749b4371bb076223e2d87ce"} Dec 05 23:36:09 crc kubenswrapper[4734]: I1205 23:36:09.565674 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fvnv4"] Dec 05 23:36:09 crc kubenswrapper[4734]: I1205 23:36:09.599156 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-rb2ss"] Dec 05 23:36:09 crc kubenswrapper[4734]: I1205 23:36:09.602248 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-rb2ss" Dec 05 23:36:09 crc kubenswrapper[4734]: I1205 23:36:09.650006 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-rb2ss"] Dec 05 23:36:09 crc kubenswrapper[4734]: I1205 23:36:09.776661 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1826ecde-f68b-4010-bb28-aab705498c88-config\") pod \"dnsmasq-dns-666b6646f7-rb2ss\" (UID: \"1826ecde-f68b-4010-bb28-aab705498c88\") " pod="openstack/dnsmasq-dns-666b6646f7-rb2ss" Dec 05 23:36:09 crc kubenswrapper[4734]: I1205 23:36:09.776729 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptd7t\" (UniqueName: \"kubernetes.io/projected/1826ecde-f68b-4010-bb28-aab705498c88-kube-api-access-ptd7t\") pod \"dnsmasq-dns-666b6646f7-rb2ss\" (UID: \"1826ecde-f68b-4010-bb28-aab705498c88\") " pod="openstack/dnsmasq-dns-666b6646f7-rb2ss" Dec 05 23:36:09 crc kubenswrapper[4734]: I1205 23:36:09.776777 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1826ecde-f68b-4010-bb28-aab705498c88-dns-svc\") pod \"dnsmasq-dns-666b6646f7-rb2ss\" (UID: \"1826ecde-f68b-4010-bb28-aab705498c88\") " 
pod="openstack/dnsmasq-dns-666b6646f7-rb2ss" Dec 05 23:36:09 crc kubenswrapper[4734]: I1205 23:36:09.882432 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1826ecde-f68b-4010-bb28-aab705498c88-config\") pod \"dnsmasq-dns-666b6646f7-rb2ss\" (UID: \"1826ecde-f68b-4010-bb28-aab705498c88\") " pod="openstack/dnsmasq-dns-666b6646f7-rb2ss" Dec 05 23:36:09 crc kubenswrapper[4734]: I1205 23:36:09.882500 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptd7t\" (UniqueName: \"kubernetes.io/projected/1826ecde-f68b-4010-bb28-aab705498c88-kube-api-access-ptd7t\") pod \"dnsmasq-dns-666b6646f7-rb2ss\" (UID: \"1826ecde-f68b-4010-bb28-aab705498c88\") " pod="openstack/dnsmasq-dns-666b6646f7-rb2ss" Dec 05 23:36:09 crc kubenswrapper[4734]: I1205 23:36:09.882549 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1826ecde-f68b-4010-bb28-aab705498c88-dns-svc\") pod \"dnsmasq-dns-666b6646f7-rb2ss\" (UID: \"1826ecde-f68b-4010-bb28-aab705498c88\") " pod="openstack/dnsmasq-dns-666b6646f7-rb2ss" Dec 05 23:36:09 crc kubenswrapper[4734]: I1205 23:36:09.883672 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1826ecde-f68b-4010-bb28-aab705498c88-dns-svc\") pod \"dnsmasq-dns-666b6646f7-rb2ss\" (UID: \"1826ecde-f68b-4010-bb28-aab705498c88\") " pod="openstack/dnsmasq-dns-666b6646f7-rb2ss" Dec 05 23:36:09 crc kubenswrapper[4734]: I1205 23:36:09.884223 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1826ecde-f68b-4010-bb28-aab705498c88-config\") pod \"dnsmasq-dns-666b6646f7-rb2ss\" (UID: \"1826ecde-f68b-4010-bb28-aab705498c88\") " pod="openstack/dnsmasq-dns-666b6646f7-rb2ss" Dec 05 23:36:09 crc kubenswrapper[4734]: I1205 23:36:09.914613 4734 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zpj4b"] Dec 05 23:36:09 crc kubenswrapper[4734]: I1205 23:36:09.916438 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptd7t\" (UniqueName: \"kubernetes.io/projected/1826ecde-f68b-4010-bb28-aab705498c88-kube-api-access-ptd7t\") pod \"dnsmasq-dns-666b6646f7-rb2ss\" (UID: \"1826ecde-f68b-4010-bb28-aab705498c88\") " pod="openstack/dnsmasq-dns-666b6646f7-rb2ss" Dec 05 23:36:09 crc kubenswrapper[4734]: I1205 23:36:09.936596 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-f684s"] Dec 05 23:36:09 crc kubenswrapper[4734]: I1205 23:36:09.939686 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-rb2ss" Dec 05 23:36:09 crc kubenswrapper[4734]: I1205 23:36:09.939883 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-f684s" Dec 05 23:36:09 crc kubenswrapper[4734]: I1205 23:36:09.957722 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-f684s"] Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.104976 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/355a394a-ee81-463e-8d82-b6c789ad6361-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-f684s\" (UID: \"355a394a-ee81-463e-8d82-b6c789ad6361\") " pod="openstack/dnsmasq-dns-57d769cc4f-f684s" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.106425 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/355a394a-ee81-463e-8d82-b6c789ad6361-config\") pod \"dnsmasq-dns-57d769cc4f-f684s\" (UID: \"355a394a-ee81-463e-8d82-b6c789ad6361\") " pod="openstack/dnsmasq-dns-57d769cc4f-f684s" Dec 05 23:36:10 crc 
kubenswrapper[4734]: I1205 23:36:10.106462 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-878pv\" (UniqueName: \"kubernetes.io/projected/355a394a-ee81-463e-8d82-b6c789ad6361-kube-api-access-878pv\") pod \"dnsmasq-dns-57d769cc4f-f684s\" (UID: \"355a394a-ee81-463e-8d82-b6c789ad6361\") " pod="openstack/dnsmasq-dns-57d769cc4f-f684s" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.211093 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/355a394a-ee81-463e-8d82-b6c789ad6361-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-f684s\" (UID: \"355a394a-ee81-463e-8d82-b6c789ad6361\") " pod="openstack/dnsmasq-dns-57d769cc4f-f684s" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.211203 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/355a394a-ee81-463e-8d82-b6c789ad6361-config\") pod \"dnsmasq-dns-57d769cc4f-f684s\" (UID: \"355a394a-ee81-463e-8d82-b6c789ad6361\") " pod="openstack/dnsmasq-dns-57d769cc4f-f684s" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.211223 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-878pv\" (UniqueName: \"kubernetes.io/projected/355a394a-ee81-463e-8d82-b6c789ad6361-kube-api-access-878pv\") pod \"dnsmasq-dns-57d769cc4f-f684s\" (UID: \"355a394a-ee81-463e-8d82-b6c789ad6361\") " pod="openstack/dnsmasq-dns-57d769cc4f-f684s" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.212617 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/355a394a-ee81-463e-8d82-b6c789ad6361-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-f684s\" (UID: \"355a394a-ee81-463e-8d82-b6c789ad6361\") " pod="openstack/dnsmasq-dns-57d769cc4f-f684s" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.215174 4734 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/355a394a-ee81-463e-8d82-b6c789ad6361-config\") pod \"dnsmasq-dns-57d769cc4f-f684s\" (UID: \"355a394a-ee81-463e-8d82-b6c789ad6361\") " pod="openstack/dnsmasq-dns-57d769cc4f-f684s" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.245681 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-878pv\" (UniqueName: \"kubernetes.io/projected/355a394a-ee81-463e-8d82-b6c789ad6361-kube-api-access-878pv\") pod \"dnsmasq-dns-57d769cc4f-f684s\" (UID: \"355a394a-ee81-463e-8d82-b6c789ad6361\") " pod="openstack/dnsmasq-dns-57d769cc4f-f684s" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.277143 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-f684s" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.752644 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.754615 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.758057 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.758386 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.758422 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-7thdb" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.758795 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.758935 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.759009 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.763629 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.772364 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.822798 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c35eaa12-d993-4769-975b-35a5ac6609e0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.822848 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/c35eaa12-d993-4769-975b-35a5ac6609e0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.822879 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c35eaa12-d993-4769-975b-35a5ac6609e0-config-data\") pod \"rabbitmq-server-0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.822898 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c35eaa12-d993-4769-975b-35a5ac6609e0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.822967 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c35eaa12-d993-4769-975b-35a5ac6609e0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.822987 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.823002 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c35eaa12-d993-4769-975b-35a5ac6609e0-rabbitmq-erlang-cookie\") 
pod \"rabbitmq-server-0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.823063 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvnnn\" (UniqueName: \"kubernetes.io/projected/c35eaa12-d993-4769-975b-35a5ac6609e0-kube-api-access-rvnnn\") pod \"rabbitmq-server-0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.823087 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c35eaa12-d993-4769-975b-35a5ac6609e0-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.823128 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c35eaa12-d993-4769-975b-35a5ac6609e0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.823159 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c35eaa12-d993-4769-975b-35a5ac6609e0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.925297 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvnnn\" (UniqueName: \"kubernetes.io/projected/c35eaa12-d993-4769-975b-35a5ac6609e0-kube-api-access-rvnnn\") pod \"rabbitmq-server-0\" (UID: 
\"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.925385 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c35eaa12-d993-4769-975b-35a5ac6609e0-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.925454 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c35eaa12-d993-4769-975b-35a5ac6609e0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.925516 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c35eaa12-d993-4769-975b-35a5ac6609e0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.925587 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c35eaa12-d993-4769-975b-35a5ac6609e0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.925640 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c35eaa12-d993-4769-975b-35a5ac6609e0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.925679 4734 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c35eaa12-d993-4769-975b-35a5ac6609e0-config-data\") pod \"rabbitmq-server-0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.925730 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c35eaa12-d993-4769-975b-35a5ac6609e0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.925757 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c35eaa12-d993-4769-975b-35a5ac6609e0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.925806 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.925835 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c35eaa12-d993-4769-975b-35a5ac6609e0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.926683 4734 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: 
\"c35eaa12-d993-4769-975b-35a5ac6609e0\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.927093 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c35eaa12-d993-4769-975b-35a5ac6609e0-config-data\") pod \"rabbitmq-server-0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.927456 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c35eaa12-d993-4769-975b-35a5ac6609e0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.927476 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c35eaa12-d993-4769-975b-35a5ac6609e0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.927588 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c35eaa12-d993-4769-975b-35a5ac6609e0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.933581 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c35eaa12-d993-4769-975b-35a5ac6609e0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.942728 4734 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c35eaa12-d993-4769-975b-35a5ac6609e0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.944770 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c35eaa12-d993-4769-975b-35a5ac6609e0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.946363 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c35eaa12-d993-4769-975b-35a5ac6609e0-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.946576 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c35eaa12-d993-4769-975b-35a5ac6609e0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.949601 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvnnn\" (UniqueName: \"kubernetes.io/projected/c35eaa12-d993-4769-975b-35a5ac6609e0-kube-api-access-rvnnn\") pod \"rabbitmq-server-0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:10 crc kubenswrapper[4734]: I1205 23:36:10.953070 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: 
\"c35eaa12-d993-4769-975b-35a5ac6609e0\") " pod="openstack/rabbitmq-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.008380 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-rb2ss"] Dec 05 23:36:11 crc kubenswrapper[4734]: W1205 23:36:11.022591 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1826ecde_f68b_4010_bb28_aab705498c88.slice/crio-c8011d7672c5f37d2040665f0ae00fd061580906bebb4e189a1635ea5c0ee67c WatchSource:0}: Error finding container c8011d7672c5f37d2040665f0ae00fd061580906bebb4e189a1635ea5c0ee67c: Status 404 returned error can't find the container with id c8011d7672c5f37d2040665f0ae00fd061580906bebb4e189a1635ea5c0ee67c Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.035391 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-f684s"] Dec 05 23:36:11 crc kubenswrapper[4734]: W1205 23:36:11.048560 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod355a394a_ee81_463e_8d82_b6c789ad6361.slice/crio-9e01e898ad702db21ef593d30af98eb1d8ca97f049e0cf2fc8903d94851549bf WatchSource:0}: Error finding container 9e01e898ad702db21ef593d30af98eb1d8ca97f049e0cf2fc8903d94851549bf: Status 404 returned error can't find the container with id 9e01e898ad702db21ef593d30af98eb1d8ca97f049e0cf2fc8903d94851549bf Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.079483 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.081322 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.083311 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.088389 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.088618 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.089822 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-ncglt" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.089850 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.090833 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.091034 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.092193 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.100586 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.130621 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ed95027c-1ded-4127-a341-7ee81018d4b6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.130691 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ed95027c-1ded-4127-a341-7ee81018d4b6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.131161 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ed95027c-1ded-4127-a341-7ee81018d4b6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.131270 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.131384 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c69bp\" (UniqueName: \"kubernetes.io/projected/ed95027c-1ded-4127-a341-7ee81018d4b6-kube-api-access-c69bp\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.131492 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ed95027c-1ded-4127-a341-7ee81018d4b6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.131553 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed95027c-1ded-4127-a341-7ee81018d4b6-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.131585 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ed95027c-1ded-4127-a341-7ee81018d4b6-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.131620 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ed95027c-1ded-4127-a341-7ee81018d4b6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.131703 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ed95027c-1ded-4127-a341-7ee81018d4b6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.131740 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ed95027c-1ded-4127-a341-7ee81018d4b6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.234087 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.234693 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c69bp\" (UniqueName: \"kubernetes.io/projected/ed95027c-1ded-4127-a341-7ee81018d4b6-kube-api-access-c69bp\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.234760 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed95027c-1ded-4127-a341-7ee81018d4b6-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.234789 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ed95027c-1ded-4127-a341-7ee81018d4b6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.234820 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ed95027c-1ded-4127-a341-7ee81018d4b6-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.234855 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ed95027c-1ded-4127-a341-7ee81018d4b6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.234861 4734 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.236380 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed95027c-1ded-4127-a341-7ee81018d4b6-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.236642 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ed95027c-1ded-4127-a341-7ee81018d4b6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.236681 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ed95027c-1ded-4127-a341-7ee81018d4b6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.236859 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ed95027c-1ded-4127-a341-7ee81018d4b6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc 
kubenswrapper[4734]: I1205 23:36:11.236974 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ed95027c-1ded-4127-a341-7ee81018d4b6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.237068 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ed95027c-1ded-4127-a341-7ee81018d4b6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.237142 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ed95027c-1ded-4127-a341-7ee81018d4b6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.238083 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ed95027c-1ded-4127-a341-7ee81018d4b6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.238442 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ed95027c-1ded-4127-a341-7ee81018d4b6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.238807 4734 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ed95027c-1ded-4127-a341-7ee81018d4b6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.242419 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ed95027c-1ded-4127-a341-7ee81018d4b6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.242980 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ed95027c-1ded-4127-a341-7ee81018d4b6-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.249863 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ed95027c-1ded-4127-a341-7ee81018d4b6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.258028 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ed95027c-1ded-4127-a341-7ee81018d4b6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.269385 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c69bp\" (UniqueName: 
\"kubernetes.io/projected/ed95027c-1ded-4127-a341-7ee81018d4b6-kube-api-access-c69bp\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.275784 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.363615 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-rb2ss" event={"ID":"1826ecde-f68b-4010-bb28-aab705498c88","Type":"ContainerStarted","Data":"c8011d7672c5f37d2040665f0ae00fd061580906bebb4e189a1635ea5c0ee67c"} Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.369907 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-f684s" event={"ID":"355a394a-ee81-463e-8d82-b6c789ad6361","Type":"ContainerStarted","Data":"9e01e898ad702db21ef593d30af98eb1d8ca97f049e0cf2fc8903d94851549bf"} Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.447093 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:36:11 crc kubenswrapper[4734]: I1205 23:36:11.749858 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.191371 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.369312 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.372028 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.377470 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-wcpgg" Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.379093 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.379445 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.379670 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.388181 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.401201 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.497030 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c35eaa12-d993-4769-975b-35a5ac6609e0","Type":"ContainerStarted","Data":"5463b83dd1e6a729a45f7c14d4b26f8d563230a8dbeb04f3da73d157f40ffe7c"} Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.505860 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ed95027c-1ded-4127-a341-7ee81018d4b6","Type":"ContainerStarted","Data":"e2856be192207e70f4ec1dd3befd5a54fb82ddf659138d1d1d0d6852a043dfce"} Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.538350 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9fd725c7-f12a-4504-a71d-46e7d0258af7-config-data-default\") pod \"openstack-galera-0\" (UID: 
\"9fd725c7-f12a-4504-a71d-46e7d0258af7\") " pod="openstack/openstack-galera-0" Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.538415 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"9fd725c7-f12a-4504-a71d-46e7d0258af7\") " pod="openstack/openstack-galera-0" Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.538438 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fd725c7-f12a-4504-a71d-46e7d0258af7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9fd725c7-f12a-4504-a71d-46e7d0258af7\") " pod="openstack/openstack-galera-0" Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.538466 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fd725c7-f12a-4504-a71d-46e7d0258af7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9fd725c7-f12a-4504-a71d-46e7d0258af7\") " pod="openstack/openstack-galera-0" Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.538492 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcf65\" (UniqueName: \"kubernetes.io/projected/9fd725c7-f12a-4504-a71d-46e7d0258af7-kube-api-access-dcf65\") pod \"openstack-galera-0\" (UID: \"9fd725c7-f12a-4504-a71d-46e7d0258af7\") " pod="openstack/openstack-galera-0" Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.538517 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9fd725c7-f12a-4504-a71d-46e7d0258af7-kolla-config\") pod \"openstack-galera-0\" (UID: \"9fd725c7-f12a-4504-a71d-46e7d0258af7\") " 
pod="openstack/openstack-galera-0" Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.538559 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9fd725c7-f12a-4504-a71d-46e7d0258af7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9fd725c7-f12a-4504-a71d-46e7d0258af7\") " pod="openstack/openstack-galera-0" Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.538594 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fd725c7-f12a-4504-a71d-46e7d0258af7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9fd725c7-f12a-4504-a71d-46e7d0258af7\") " pod="openstack/openstack-galera-0" Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.640294 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fd725c7-f12a-4504-a71d-46e7d0258af7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9fd725c7-f12a-4504-a71d-46e7d0258af7\") " pod="openstack/openstack-galera-0" Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.640358 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcf65\" (UniqueName: \"kubernetes.io/projected/9fd725c7-f12a-4504-a71d-46e7d0258af7-kube-api-access-dcf65\") pod \"openstack-galera-0\" (UID: \"9fd725c7-f12a-4504-a71d-46e7d0258af7\") " pod="openstack/openstack-galera-0" Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.640416 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9fd725c7-f12a-4504-a71d-46e7d0258af7-kolla-config\") pod \"openstack-galera-0\" (UID: \"9fd725c7-f12a-4504-a71d-46e7d0258af7\") " pod="openstack/openstack-galera-0" Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 
23:36:12.640444 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9fd725c7-f12a-4504-a71d-46e7d0258af7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9fd725c7-f12a-4504-a71d-46e7d0258af7\") " pod="openstack/openstack-galera-0" Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.640961 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fd725c7-f12a-4504-a71d-46e7d0258af7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9fd725c7-f12a-4504-a71d-46e7d0258af7\") " pod="openstack/openstack-galera-0" Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.641197 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9fd725c7-f12a-4504-a71d-46e7d0258af7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9fd725c7-f12a-4504-a71d-46e7d0258af7\") " pod="openstack/openstack-galera-0" Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.641387 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9fd725c7-f12a-4504-a71d-46e7d0258af7-kolla-config\") pod \"openstack-galera-0\" (UID: \"9fd725c7-f12a-4504-a71d-46e7d0258af7\") " pod="openstack/openstack-galera-0" Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.641645 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9fd725c7-f12a-4504-a71d-46e7d0258af7-config-data-default\") pod \"openstack-galera-0\" (UID: \"9fd725c7-f12a-4504-a71d-46e7d0258af7\") " pod="openstack/openstack-galera-0" Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.641713 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"9fd725c7-f12a-4504-a71d-46e7d0258af7\") " pod="openstack/openstack-galera-0" Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.641752 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fd725c7-f12a-4504-a71d-46e7d0258af7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9fd725c7-f12a-4504-a71d-46e7d0258af7\") " pod="openstack/openstack-galera-0" Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.642328 4734 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"9fd725c7-f12a-4504-a71d-46e7d0258af7\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-galera-0" Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.642379 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9fd725c7-f12a-4504-a71d-46e7d0258af7-config-data-default\") pod \"openstack-galera-0\" (UID: \"9fd725c7-f12a-4504-a71d-46e7d0258af7\") " pod="openstack/openstack-galera-0" Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.643632 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fd725c7-f12a-4504-a71d-46e7d0258af7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9fd725c7-f12a-4504-a71d-46e7d0258af7\") " pod="openstack/openstack-galera-0" Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.650307 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fd725c7-f12a-4504-a71d-46e7d0258af7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9fd725c7-f12a-4504-a71d-46e7d0258af7\") " 
pod="openstack/openstack-galera-0" Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.661065 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fd725c7-f12a-4504-a71d-46e7d0258af7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9fd725c7-f12a-4504-a71d-46e7d0258af7\") " pod="openstack/openstack-galera-0" Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.663879 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcf65\" (UniqueName: \"kubernetes.io/projected/9fd725c7-f12a-4504-a71d-46e7d0258af7-kube-api-access-dcf65\") pod \"openstack-galera-0\" (UID: \"9fd725c7-f12a-4504-a71d-46e7d0258af7\") " pod="openstack/openstack-galera-0" Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.726279 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"9fd725c7-f12a-4504-a71d-46e7d0258af7\") " pod="openstack/openstack-galera-0" Dec 05 23:36:12 crc kubenswrapper[4734]: I1205 23:36:12.793287 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 05 23:36:13 crc kubenswrapper[4734]: I1205 23:36:13.545820 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 05 23:36:13 crc kubenswrapper[4734]: I1205 23:36:13.551242 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 05 23:36:13 crc kubenswrapper[4734]: I1205 23:36:13.554846 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-jx8xw" Dec 05 23:36:13 crc kubenswrapper[4734]: I1205 23:36:13.555604 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 05 23:36:13 crc kubenswrapper[4734]: I1205 23:36:13.556564 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 05 23:36:13 crc kubenswrapper[4734]: I1205 23:36:13.556958 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 05 23:36:13 crc kubenswrapper[4734]: I1205 23:36:13.571459 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 05 23:36:13 crc kubenswrapper[4734]: I1205 23:36:13.679979 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3cc9e4dc-431f-4963-911b-f6262ac3c6b5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"3cc9e4dc-431f-4963-911b-f6262ac3c6b5\") " pod="openstack/openstack-cell1-galera-0" Dec 05 23:36:13 crc kubenswrapper[4734]: I1205 23:36:13.680055 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cc9e4dc-431f-4963-911b-f6262ac3c6b5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"3cc9e4dc-431f-4963-911b-f6262ac3c6b5\") " pod="openstack/openstack-cell1-galera-0" Dec 05 23:36:13 crc kubenswrapper[4734]: I1205 23:36:13.680121 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpf7b\" (UniqueName: 
\"kubernetes.io/projected/3cc9e4dc-431f-4963-911b-f6262ac3c6b5-kube-api-access-fpf7b\") pod \"openstack-cell1-galera-0\" (UID: \"3cc9e4dc-431f-4963-911b-f6262ac3c6b5\") " pod="openstack/openstack-cell1-galera-0" Dec 05 23:36:13 crc kubenswrapper[4734]: I1205 23:36:13.680155 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"3cc9e4dc-431f-4963-911b-f6262ac3c6b5\") " pod="openstack/openstack-cell1-galera-0" Dec 05 23:36:13 crc kubenswrapper[4734]: I1205 23:36:13.680178 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cc9e4dc-431f-4963-911b-f6262ac3c6b5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"3cc9e4dc-431f-4963-911b-f6262ac3c6b5\") " pod="openstack/openstack-cell1-galera-0" Dec 05 23:36:13 crc kubenswrapper[4734]: I1205 23:36:13.680211 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3cc9e4dc-431f-4963-911b-f6262ac3c6b5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"3cc9e4dc-431f-4963-911b-f6262ac3c6b5\") " pod="openstack/openstack-cell1-galera-0" Dec 05 23:36:13 crc kubenswrapper[4734]: I1205 23:36:13.680239 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cc9e4dc-431f-4963-911b-f6262ac3c6b5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"3cc9e4dc-431f-4963-911b-f6262ac3c6b5\") " pod="openstack/openstack-cell1-galera-0" Dec 05 23:36:13 crc kubenswrapper[4734]: I1205 23:36:13.680517 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" 
(UniqueName: \"kubernetes.io/empty-dir/3cc9e4dc-431f-4963-911b-f6262ac3c6b5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"3cc9e4dc-431f-4963-911b-f6262ac3c6b5\") " pod="openstack/openstack-cell1-galera-0" Dec 05 23:36:13 crc kubenswrapper[4734]: I1205 23:36:13.784812 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3cc9e4dc-431f-4963-911b-f6262ac3c6b5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"3cc9e4dc-431f-4963-911b-f6262ac3c6b5\") " pod="openstack/openstack-cell1-galera-0" Dec 05 23:36:13 crc kubenswrapper[4734]: I1205 23:36:13.784945 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3cc9e4dc-431f-4963-911b-f6262ac3c6b5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"3cc9e4dc-431f-4963-911b-f6262ac3c6b5\") " pod="openstack/openstack-cell1-galera-0" Dec 05 23:36:13 crc kubenswrapper[4734]: I1205 23:36:13.785085 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cc9e4dc-431f-4963-911b-f6262ac3c6b5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"3cc9e4dc-431f-4963-911b-f6262ac3c6b5\") " pod="openstack/openstack-cell1-galera-0" Dec 05 23:36:13 crc kubenswrapper[4734]: I1205 23:36:13.785158 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpf7b\" (UniqueName: \"kubernetes.io/projected/3cc9e4dc-431f-4963-911b-f6262ac3c6b5-kube-api-access-fpf7b\") pod \"openstack-cell1-galera-0\" (UID: \"3cc9e4dc-431f-4963-911b-f6262ac3c6b5\") " pod="openstack/openstack-cell1-galera-0" Dec 05 23:36:13 crc kubenswrapper[4734]: I1205 23:36:13.785246 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"3cc9e4dc-431f-4963-911b-f6262ac3c6b5\") " pod="openstack/openstack-cell1-galera-0" Dec 05 23:36:13 crc kubenswrapper[4734]: I1205 23:36:13.785273 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cc9e4dc-431f-4963-911b-f6262ac3c6b5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"3cc9e4dc-431f-4963-911b-f6262ac3c6b5\") " pod="openstack/openstack-cell1-galera-0" Dec 05 23:36:13 crc kubenswrapper[4734]: I1205 23:36:13.785385 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3cc9e4dc-431f-4963-911b-f6262ac3c6b5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"3cc9e4dc-431f-4963-911b-f6262ac3c6b5\") " pod="openstack/openstack-cell1-galera-0" Dec 05 23:36:13 crc kubenswrapper[4734]: I1205 23:36:13.785450 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cc9e4dc-431f-4963-911b-f6262ac3c6b5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"3cc9e4dc-431f-4963-911b-f6262ac3c6b5\") " pod="openstack/openstack-cell1-galera-0" Dec 05 23:36:13 crc kubenswrapper[4734]: I1205 23:36:13.786233 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3cc9e4dc-431f-4963-911b-f6262ac3c6b5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"3cc9e4dc-431f-4963-911b-f6262ac3c6b5\") " pod="openstack/openstack-cell1-galera-0" Dec 05 23:36:13 crc kubenswrapper[4734]: I1205 23:36:13.786750 4734 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: 
\"3cc9e4dc-431f-4963-911b-f6262ac3c6b5\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-cell1-galera-0" Dec 05 23:36:13 crc kubenswrapper[4734]: I1205 23:36:13.786797 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3cc9e4dc-431f-4963-911b-f6262ac3c6b5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"3cc9e4dc-431f-4963-911b-f6262ac3c6b5\") " pod="openstack/openstack-cell1-galera-0" Dec 05 23:36:13 crc kubenswrapper[4734]: I1205 23:36:13.788211 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3cc9e4dc-431f-4963-911b-f6262ac3c6b5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"3cc9e4dc-431f-4963-911b-f6262ac3c6b5\") " pod="openstack/openstack-cell1-galera-0" Dec 05 23:36:13 crc kubenswrapper[4734]: I1205 23:36:13.788506 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cc9e4dc-431f-4963-911b-f6262ac3c6b5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"3cc9e4dc-431f-4963-911b-f6262ac3c6b5\") " pod="openstack/openstack-cell1-galera-0" Dec 05 23:36:13 crc kubenswrapper[4734]: I1205 23:36:13.791508 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cc9e4dc-431f-4963-911b-f6262ac3c6b5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"3cc9e4dc-431f-4963-911b-f6262ac3c6b5\") " pod="openstack/openstack-cell1-galera-0" Dec 05 23:36:13 crc kubenswrapper[4734]: I1205 23:36:13.791830 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cc9e4dc-431f-4963-911b-f6262ac3c6b5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"3cc9e4dc-431f-4963-911b-f6262ac3c6b5\") " 
pod="openstack/openstack-cell1-galera-0" Dec 05 23:36:13 crc kubenswrapper[4734]: I1205 23:36:13.825864 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"3cc9e4dc-431f-4963-911b-f6262ac3c6b5\") " pod="openstack/openstack-cell1-galera-0" Dec 05 23:36:13 crc kubenswrapper[4734]: I1205 23:36:13.899256 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpf7b\" (UniqueName: \"kubernetes.io/projected/3cc9e4dc-431f-4963-911b-f6262ac3c6b5-kube-api-access-fpf7b\") pod \"openstack-cell1-galera-0\" (UID: \"3cc9e4dc-431f-4963-911b-f6262ac3c6b5\") " pod="openstack/openstack-cell1-galera-0" Dec 05 23:36:14 crc kubenswrapper[4734]: I1205 23:36:14.039754 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 05 23:36:14 crc kubenswrapper[4734]: I1205 23:36:14.040983 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 05 23:36:14 crc kubenswrapper[4734]: I1205 23:36:14.053035 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 05 23:36:14 crc kubenswrapper[4734]: I1205 23:36:14.053837 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 05 23:36:14 crc kubenswrapper[4734]: I1205 23:36:14.054324 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-pgpfn" Dec 05 23:36:14 crc kubenswrapper[4734]: I1205 23:36:14.054467 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 05 23:36:14 crc kubenswrapper[4734]: I1205 23:36:14.203460 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/61801e1d-6a79-497f-822b-69b683c2f78b-kolla-config\") pod \"memcached-0\" (UID: \"61801e1d-6a79-497f-822b-69b683c2f78b\") " pod="openstack/memcached-0" Dec 05 23:36:14 crc kubenswrapper[4734]: I1205 23:36:14.205729 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/61801e1d-6a79-497f-822b-69b683c2f78b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"61801e1d-6a79-497f-822b-69b683c2f78b\") " pod="openstack/memcached-0" Dec 05 23:36:14 crc kubenswrapper[4734]: I1205 23:36:14.205786 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61801e1d-6a79-497f-822b-69b683c2f78b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"61801e1d-6a79-497f-822b-69b683c2f78b\") " pod="openstack/memcached-0" Dec 05 23:36:14 crc kubenswrapper[4734]: I1205 23:36:14.205865 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/61801e1d-6a79-497f-822b-69b683c2f78b-config-data\") pod \"memcached-0\" (UID: \"61801e1d-6a79-497f-822b-69b683c2f78b\") " pod="openstack/memcached-0" Dec 05 23:36:14 crc kubenswrapper[4734]: I1205 23:36:14.206026 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xc7r\" (UniqueName: \"kubernetes.io/projected/61801e1d-6a79-497f-822b-69b683c2f78b-kube-api-access-5xc7r\") pod \"memcached-0\" (UID: \"61801e1d-6a79-497f-822b-69b683c2f78b\") " pod="openstack/memcached-0" Dec 05 23:36:14 crc kubenswrapper[4734]: I1205 23:36:14.203466 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 05 23:36:14 crc kubenswrapper[4734]: I1205 23:36:14.310153 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xc7r\" (UniqueName: \"kubernetes.io/projected/61801e1d-6a79-497f-822b-69b683c2f78b-kube-api-access-5xc7r\") pod \"memcached-0\" (UID: \"61801e1d-6a79-497f-822b-69b683c2f78b\") " pod="openstack/memcached-0" Dec 05 23:36:14 crc kubenswrapper[4734]: I1205 23:36:14.310277 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/61801e1d-6a79-497f-822b-69b683c2f78b-kolla-config\") pod \"memcached-0\" (UID: \"61801e1d-6a79-497f-822b-69b683c2f78b\") " pod="openstack/memcached-0" Dec 05 23:36:14 crc kubenswrapper[4734]: I1205 23:36:14.310318 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/61801e1d-6a79-497f-822b-69b683c2f78b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"61801e1d-6a79-497f-822b-69b683c2f78b\") " pod="openstack/memcached-0" Dec 05 23:36:14 crc kubenswrapper[4734]: I1205 23:36:14.310369 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61801e1d-6a79-497f-822b-69b683c2f78b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"61801e1d-6a79-497f-822b-69b683c2f78b\") " pod="openstack/memcached-0" Dec 05 23:36:14 crc kubenswrapper[4734]: I1205 23:36:14.310470 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/61801e1d-6a79-497f-822b-69b683c2f78b-config-data\") pod \"memcached-0\" (UID: \"61801e1d-6a79-497f-822b-69b683c2f78b\") " pod="openstack/memcached-0" Dec 05 23:36:14 crc kubenswrapper[4734]: I1205 23:36:14.311801 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/61801e1d-6a79-497f-822b-69b683c2f78b-config-data\") pod \"memcached-0\" (UID: \"61801e1d-6a79-497f-822b-69b683c2f78b\") " pod="openstack/memcached-0" Dec 05 23:36:14 crc kubenswrapper[4734]: I1205 23:36:14.312054 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/61801e1d-6a79-497f-822b-69b683c2f78b-kolla-config\") pod \"memcached-0\" (UID: \"61801e1d-6a79-497f-822b-69b683c2f78b\") " pod="openstack/memcached-0" Dec 05 23:36:14 crc kubenswrapper[4734]: I1205 23:36:14.315137 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/61801e1d-6a79-497f-822b-69b683c2f78b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"61801e1d-6a79-497f-822b-69b683c2f78b\") " pod="openstack/memcached-0" Dec 05 23:36:14 crc kubenswrapper[4734]: I1205 23:36:14.320705 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61801e1d-6a79-497f-822b-69b683c2f78b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"61801e1d-6a79-497f-822b-69b683c2f78b\") " pod="openstack/memcached-0" Dec 05 23:36:14 crc 
kubenswrapper[4734]: I1205 23:36:14.353201 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xc7r\" (UniqueName: \"kubernetes.io/projected/61801e1d-6a79-497f-822b-69b683c2f78b-kube-api-access-5xc7r\") pod \"memcached-0\" (UID: \"61801e1d-6a79-497f-822b-69b683c2f78b\") " pod="openstack/memcached-0" Dec 05 23:36:14 crc kubenswrapper[4734]: I1205 23:36:14.386202 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 05 23:36:15 crc kubenswrapper[4734]: I1205 23:36:15.709698 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 23:36:15 crc kubenswrapper[4734]: I1205 23:36:15.711842 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 23:36:15 crc kubenswrapper[4734]: I1205 23:36:15.714964 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 23:36:15 crc kubenswrapper[4734]: I1205 23:36:15.742126 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-qkr6s" Dec 05 23:36:15 crc kubenswrapper[4734]: I1205 23:36:15.861632 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fmwq\" (UniqueName: \"kubernetes.io/projected/4cf3a204-9b47-4206-9964-deb892777324-kube-api-access-4fmwq\") pod \"kube-state-metrics-0\" (UID: \"4cf3a204-9b47-4206-9964-deb892777324\") " pod="openstack/kube-state-metrics-0" Dec 05 23:36:15 crc kubenswrapper[4734]: I1205 23:36:15.964219 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fmwq\" (UniqueName: \"kubernetes.io/projected/4cf3a204-9b47-4206-9964-deb892777324-kube-api-access-4fmwq\") pod \"kube-state-metrics-0\" (UID: \"4cf3a204-9b47-4206-9964-deb892777324\") " pod="openstack/kube-state-metrics-0" Dec 05 23:36:16 crc 
kubenswrapper[4734]: I1205 23:36:16.001295 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fmwq\" (UniqueName: \"kubernetes.io/projected/4cf3a204-9b47-4206-9964-deb892777324-kube-api-access-4fmwq\") pod \"kube-state-metrics-0\" (UID: \"4cf3a204-9b47-4206-9964-deb892777324\") " pod="openstack/kube-state-metrics-0" Dec 05 23:36:16 crc kubenswrapper[4734]: I1205 23:36:16.074247 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 23:36:19 crc kubenswrapper[4734]: I1205 23:36:19.850954 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 23:36:19 crc kubenswrapper[4734]: I1205 23:36:19.853288 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 05 23:36:19 crc kubenswrapper[4734]: I1205 23:36:19.857036 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 05 23:36:19 crc kubenswrapper[4734]: I1205 23:36:19.859438 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-kdfr5" Dec 05 23:36:19 crc kubenswrapper[4734]: I1205 23:36:19.859867 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 05 23:36:19 crc kubenswrapper[4734]: I1205 23:36:19.860064 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 05 23:36:19 crc kubenswrapper[4734]: I1205 23:36:19.865657 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 05 23:36:19 crc kubenswrapper[4734]: I1205 23:36:19.868311 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 23:36:19 crc kubenswrapper[4734]: I1205 23:36:19.956128 4734 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fceddd4-e096-4a7e-875f-756279962334-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3fceddd4-e096-4a7e-875f-756279962334\") " pod="openstack/ovsdbserver-nb-0" Dec 05 23:36:19 crc kubenswrapper[4734]: I1205 23:36:19.956199 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fceddd4-e096-4a7e-875f-756279962334-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3fceddd4-e096-4a7e-875f-756279962334\") " pod="openstack/ovsdbserver-nb-0" Dec 05 23:36:19 crc kubenswrapper[4734]: I1205 23:36:19.956238 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fceddd4-e096-4a7e-875f-756279962334-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3fceddd4-e096-4a7e-875f-756279962334\") " pod="openstack/ovsdbserver-nb-0" Dec 05 23:36:19 crc kubenswrapper[4734]: I1205 23:36:19.956261 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3fceddd4-e096-4a7e-875f-756279962334-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3fceddd4-e096-4a7e-875f-756279962334\") " pod="openstack/ovsdbserver-nb-0" Dec 05 23:36:19 crc kubenswrapper[4734]: I1205 23:36:19.956285 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fceddd4-e096-4a7e-875f-756279962334-config\") pod \"ovsdbserver-nb-0\" (UID: \"3fceddd4-e096-4a7e-875f-756279962334\") " pod="openstack/ovsdbserver-nb-0" Dec 05 23:36:19 crc kubenswrapper[4734]: I1205 23:36:19.956328 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fceddd4-e096-4a7e-875f-756279962334-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3fceddd4-e096-4a7e-875f-756279962334\") " pod="openstack/ovsdbserver-nb-0" Dec 05 23:36:19 crc kubenswrapper[4734]: I1205 23:36:19.956350 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3fceddd4-e096-4a7e-875f-756279962334\") " pod="openstack/ovsdbserver-nb-0" Dec 05 23:36:19 crc kubenswrapper[4734]: I1205 23:36:19.956382 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt4p6\" (UniqueName: \"kubernetes.io/projected/3fceddd4-e096-4a7e-875f-756279962334-kube-api-access-pt4p6\") pod \"ovsdbserver-nb-0\" (UID: \"3fceddd4-e096-4a7e-875f-756279962334\") " pod="openstack/ovsdbserver-nb-0" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.057917 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fceddd4-e096-4a7e-875f-756279962334-config\") pod \"ovsdbserver-nb-0\" (UID: \"3fceddd4-e096-4a7e-875f-756279962334\") " pod="openstack/ovsdbserver-nb-0" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.057987 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fceddd4-e096-4a7e-875f-756279962334-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3fceddd4-e096-4a7e-875f-756279962334\") " pod="openstack/ovsdbserver-nb-0" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.058021 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: 
\"3fceddd4-e096-4a7e-875f-756279962334\") " pod="openstack/ovsdbserver-nb-0" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.058054 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt4p6\" (UniqueName: \"kubernetes.io/projected/3fceddd4-e096-4a7e-875f-756279962334-kube-api-access-pt4p6\") pod \"ovsdbserver-nb-0\" (UID: \"3fceddd4-e096-4a7e-875f-756279962334\") " pod="openstack/ovsdbserver-nb-0" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.058094 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fceddd4-e096-4a7e-875f-756279962334-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3fceddd4-e096-4a7e-875f-756279962334\") " pod="openstack/ovsdbserver-nb-0" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.058133 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fceddd4-e096-4a7e-875f-756279962334-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3fceddd4-e096-4a7e-875f-756279962334\") " pod="openstack/ovsdbserver-nb-0" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.058167 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fceddd4-e096-4a7e-875f-756279962334-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3fceddd4-e096-4a7e-875f-756279962334\") " pod="openstack/ovsdbserver-nb-0" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.058187 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3fceddd4-e096-4a7e-875f-756279962334-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3fceddd4-e096-4a7e-875f-756279962334\") " pod="openstack/ovsdbserver-nb-0" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.058834 4734 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fceddd4-e096-4a7e-875f-756279962334-config\") pod \"ovsdbserver-nb-0\" (UID: \"3fceddd4-e096-4a7e-875f-756279962334\") " pod="openstack/ovsdbserver-nb-0" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.058987 4734 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3fceddd4-e096-4a7e-875f-756279962334\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-nb-0" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.059077 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3fceddd4-e096-4a7e-875f-756279962334-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3fceddd4-e096-4a7e-875f-756279962334\") " pod="openstack/ovsdbserver-nb-0" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.059955 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fceddd4-e096-4a7e-875f-756279962334-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3fceddd4-e096-4a7e-875f-756279962334\") " pod="openstack/ovsdbserver-nb-0" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.065473 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fceddd4-e096-4a7e-875f-756279962334-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3fceddd4-e096-4a7e-875f-756279962334\") " pod="openstack/ovsdbserver-nb-0" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.072193 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fceddd4-e096-4a7e-875f-756279962334-ovsdbserver-nb-tls-certs\") pod 
\"ovsdbserver-nb-0\" (UID: \"3fceddd4-e096-4a7e-875f-756279962334\") " pod="openstack/ovsdbserver-nb-0" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.074580 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fceddd4-e096-4a7e-875f-756279962334-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3fceddd4-e096-4a7e-875f-756279962334\") " pod="openstack/ovsdbserver-nb-0" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.080025 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt4p6\" (UniqueName: \"kubernetes.io/projected/3fceddd4-e096-4a7e-875f-756279962334-kube-api-access-pt4p6\") pod \"ovsdbserver-nb-0\" (UID: \"3fceddd4-e096-4a7e-875f-756279962334\") " pod="openstack/ovsdbserver-nb-0" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.101869 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3fceddd4-e096-4a7e-875f-756279962334\") " pod="openstack/ovsdbserver-nb-0" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.180344 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.444438 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.444551 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.444619 4734 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.445458 4734 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ce94e8a7ce0afa1b302b0a1993b5d90206c505bc6302ab5507859a6eab1dd7e0"} pod="openshift-machine-config-operator/machine-config-daemon-vn94d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.445569 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" containerID="cri-o://ce94e8a7ce0afa1b302b0a1993b5d90206c505bc6302ab5507859a6eab1dd7e0" gracePeriod=600 Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.580891 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-587wk"] Dec 05 
23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.582753 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-587wk" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.587071 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-w7qfr" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.587368 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.591592 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.592810 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-587wk"] Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.622466 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-tpdrq"] Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.624853 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-tpdrq" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.633229 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-tpdrq"] Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.669778 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/625f2253-5867-4d61-a436-264a79c0bd94-scripts\") pod \"ovn-controller-587wk\" (UID: \"625f2253-5867-4d61-a436-264a79c0bd94\") " pod="openstack/ovn-controller-587wk" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.669830 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/625f2253-5867-4d61-a436-264a79c0bd94-combined-ca-bundle\") pod \"ovn-controller-587wk\" (UID: \"625f2253-5867-4d61-a436-264a79c0bd94\") " pod="openstack/ovn-controller-587wk" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.669850 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmzh8\" (UniqueName: \"kubernetes.io/projected/625f2253-5867-4d61-a436-264a79c0bd94-kube-api-access-rmzh8\") pod \"ovn-controller-587wk\" (UID: \"625f2253-5867-4d61-a436-264a79c0bd94\") " pod="openstack/ovn-controller-587wk" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.670045 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/625f2253-5867-4d61-a436-264a79c0bd94-var-run-ovn\") pod \"ovn-controller-587wk\" (UID: \"625f2253-5867-4d61-a436-264a79c0bd94\") " pod="openstack/ovn-controller-587wk" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.670237 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/625f2253-5867-4d61-a436-264a79c0bd94-ovn-controller-tls-certs\") pod \"ovn-controller-587wk\" (UID: \"625f2253-5867-4d61-a436-264a79c0bd94\") " pod="openstack/ovn-controller-587wk" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.670341 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/625f2253-5867-4d61-a436-264a79c0bd94-var-run\") pod \"ovn-controller-587wk\" (UID: \"625f2253-5867-4d61-a436-264a79c0bd94\") " pod="openstack/ovn-controller-587wk" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.670394 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/625f2253-5867-4d61-a436-264a79c0bd94-var-log-ovn\") pod \"ovn-controller-587wk\" (UID: \"625f2253-5867-4d61-a436-264a79c0bd94\") " pod="openstack/ovn-controller-587wk" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.772266 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkrrw\" (UniqueName: \"kubernetes.io/projected/9631bcf5-05df-4e1d-b849-7352ef35013f-kube-api-access-nkrrw\") pod \"ovn-controller-ovs-tpdrq\" (UID: \"9631bcf5-05df-4e1d-b849-7352ef35013f\") " pod="openstack/ovn-controller-ovs-tpdrq" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.772341 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9631bcf5-05df-4e1d-b849-7352ef35013f-scripts\") pod \"ovn-controller-ovs-tpdrq\" (UID: \"9631bcf5-05df-4e1d-b849-7352ef35013f\") " pod="openstack/ovn-controller-ovs-tpdrq" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.772412 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: 
\"kubernetes.io/host-path/9631bcf5-05df-4e1d-b849-7352ef35013f-etc-ovs\") pod \"ovn-controller-ovs-tpdrq\" (UID: \"9631bcf5-05df-4e1d-b849-7352ef35013f\") " pod="openstack/ovn-controller-ovs-tpdrq" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.772458 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9631bcf5-05df-4e1d-b849-7352ef35013f-var-run\") pod \"ovn-controller-ovs-tpdrq\" (UID: \"9631bcf5-05df-4e1d-b849-7352ef35013f\") " pod="openstack/ovn-controller-ovs-tpdrq" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.772501 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/625f2253-5867-4d61-a436-264a79c0bd94-scripts\") pod \"ovn-controller-587wk\" (UID: \"625f2253-5867-4d61-a436-264a79c0bd94\") " pod="openstack/ovn-controller-587wk" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.772564 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/625f2253-5867-4d61-a436-264a79c0bd94-combined-ca-bundle\") pod \"ovn-controller-587wk\" (UID: \"625f2253-5867-4d61-a436-264a79c0bd94\") " pod="openstack/ovn-controller-587wk" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.772595 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmzh8\" (UniqueName: \"kubernetes.io/projected/625f2253-5867-4d61-a436-264a79c0bd94-kube-api-access-rmzh8\") pod \"ovn-controller-587wk\" (UID: \"625f2253-5867-4d61-a436-264a79c0bd94\") " pod="openstack/ovn-controller-587wk" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.772639 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9631bcf5-05df-4e1d-b849-7352ef35013f-var-lib\") pod 
\"ovn-controller-ovs-tpdrq\" (UID: \"9631bcf5-05df-4e1d-b849-7352ef35013f\") " pod="openstack/ovn-controller-ovs-tpdrq" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.772657 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/625f2253-5867-4d61-a436-264a79c0bd94-var-run-ovn\") pod \"ovn-controller-587wk\" (UID: \"625f2253-5867-4d61-a436-264a79c0bd94\") " pod="openstack/ovn-controller-587wk" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.772677 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9631bcf5-05df-4e1d-b849-7352ef35013f-var-log\") pod \"ovn-controller-ovs-tpdrq\" (UID: \"9631bcf5-05df-4e1d-b849-7352ef35013f\") " pod="openstack/ovn-controller-ovs-tpdrq" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.772714 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/625f2253-5867-4d61-a436-264a79c0bd94-ovn-controller-tls-certs\") pod \"ovn-controller-587wk\" (UID: \"625f2253-5867-4d61-a436-264a79c0bd94\") " pod="openstack/ovn-controller-587wk" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.772747 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/625f2253-5867-4d61-a436-264a79c0bd94-var-run\") pod \"ovn-controller-587wk\" (UID: \"625f2253-5867-4d61-a436-264a79c0bd94\") " pod="openstack/ovn-controller-587wk" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.772864 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/625f2253-5867-4d61-a436-264a79c0bd94-var-log-ovn\") pod \"ovn-controller-587wk\" (UID: \"625f2253-5867-4d61-a436-264a79c0bd94\") " pod="openstack/ovn-controller-587wk" Dec 05 
23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.773316 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/625f2253-5867-4d61-a436-264a79c0bd94-var-run\") pod \"ovn-controller-587wk\" (UID: \"625f2253-5867-4d61-a436-264a79c0bd94\") " pod="openstack/ovn-controller-587wk" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.773453 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/625f2253-5867-4d61-a436-264a79c0bd94-var-log-ovn\") pod \"ovn-controller-587wk\" (UID: \"625f2253-5867-4d61-a436-264a79c0bd94\") " pod="openstack/ovn-controller-587wk" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.773457 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/625f2253-5867-4d61-a436-264a79c0bd94-var-run-ovn\") pod \"ovn-controller-587wk\" (UID: \"625f2253-5867-4d61-a436-264a79c0bd94\") " pod="openstack/ovn-controller-587wk" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.776055 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/625f2253-5867-4d61-a436-264a79c0bd94-scripts\") pod \"ovn-controller-587wk\" (UID: \"625f2253-5867-4d61-a436-264a79c0bd94\") " pod="openstack/ovn-controller-587wk" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.777965 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/625f2253-5867-4d61-a436-264a79c0bd94-combined-ca-bundle\") pod \"ovn-controller-587wk\" (UID: \"625f2253-5867-4d61-a436-264a79c0bd94\") " pod="openstack/ovn-controller-587wk" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.783055 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/625f2253-5867-4d61-a436-264a79c0bd94-ovn-controller-tls-certs\") pod \"ovn-controller-587wk\" (UID: \"625f2253-5867-4d61-a436-264a79c0bd94\") " pod="openstack/ovn-controller-587wk" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.803360 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmzh8\" (UniqueName: \"kubernetes.io/projected/625f2253-5867-4d61-a436-264a79c0bd94-kube-api-access-rmzh8\") pod \"ovn-controller-587wk\" (UID: \"625f2253-5867-4d61-a436-264a79c0bd94\") " pod="openstack/ovn-controller-587wk" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.875212 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9631bcf5-05df-4e1d-b849-7352ef35013f-var-lib\") pod \"ovn-controller-ovs-tpdrq\" (UID: \"9631bcf5-05df-4e1d-b849-7352ef35013f\") " pod="openstack/ovn-controller-ovs-tpdrq" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.875971 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9631bcf5-05df-4e1d-b849-7352ef35013f-var-log\") pod \"ovn-controller-ovs-tpdrq\" (UID: \"9631bcf5-05df-4e1d-b849-7352ef35013f\") " pod="openstack/ovn-controller-ovs-tpdrq" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.876254 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9631bcf5-05df-4e1d-b849-7352ef35013f-var-log\") pod \"ovn-controller-ovs-tpdrq\" (UID: \"9631bcf5-05df-4e1d-b849-7352ef35013f\") " pod="openstack/ovn-controller-ovs-tpdrq" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.876314 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9631bcf5-05df-4e1d-b849-7352ef35013f-var-lib\") pod \"ovn-controller-ovs-tpdrq\" (UID: \"9631bcf5-05df-4e1d-b849-7352ef35013f\") " 
pod="openstack/ovn-controller-ovs-tpdrq" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.876459 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkrrw\" (UniqueName: \"kubernetes.io/projected/9631bcf5-05df-4e1d-b849-7352ef35013f-kube-api-access-nkrrw\") pod \"ovn-controller-ovs-tpdrq\" (UID: \"9631bcf5-05df-4e1d-b849-7352ef35013f\") " pod="openstack/ovn-controller-ovs-tpdrq" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.876500 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9631bcf5-05df-4e1d-b849-7352ef35013f-scripts\") pod \"ovn-controller-ovs-tpdrq\" (UID: \"9631bcf5-05df-4e1d-b849-7352ef35013f\") " pod="openstack/ovn-controller-ovs-tpdrq" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.876598 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9631bcf5-05df-4e1d-b849-7352ef35013f-etc-ovs\") pod \"ovn-controller-ovs-tpdrq\" (UID: \"9631bcf5-05df-4e1d-b849-7352ef35013f\") " pod="openstack/ovn-controller-ovs-tpdrq" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.876638 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9631bcf5-05df-4e1d-b849-7352ef35013f-var-run\") pod \"ovn-controller-ovs-tpdrq\" (UID: \"9631bcf5-05df-4e1d-b849-7352ef35013f\") " pod="openstack/ovn-controller-ovs-tpdrq" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.876798 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9631bcf5-05df-4e1d-b849-7352ef35013f-var-run\") pod \"ovn-controller-ovs-tpdrq\" (UID: \"9631bcf5-05df-4e1d-b849-7352ef35013f\") " pod="openstack/ovn-controller-ovs-tpdrq" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.877442 4734 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9631bcf5-05df-4e1d-b849-7352ef35013f-etc-ovs\") pod \"ovn-controller-ovs-tpdrq\" (UID: \"9631bcf5-05df-4e1d-b849-7352ef35013f\") " pod="openstack/ovn-controller-ovs-tpdrq" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.879397 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9631bcf5-05df-4e1d-b849-7352ef35013f-scripts\") pod \"ovn-controller-ovs-tpdrq\" (UID: \"9631bcf5-05df-4e1d-b849-7352ef35013f\") " pod="openstack/ovn-controller-ovs-tpdrq" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.901417 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkrrw\" (UniqueName: \"kubernetes.io/projected/9631bcf5-05df-4e1d-b849-7352ef35013f-kube-api-access-nkrrw\") pod \"ovn-controller-ovs-tpdrq\" (UID: \"9631bcf5-05df-4e1d-b849-7352ef35013f\") " pod="openstack/ovn-controller-ovs-tpdrq" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.906917 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-587wk" Dec 05 23:36:20 crc kubenswrapper[4734]: I1205 23:36:20.942452 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-tpdrq" Dec 05 23:36:21 crc kubenswrapper[4734]: I1205 23:36:21.685123 4734 generic.go:334] "Generic (PLEG): container finished" podID="65758270-a7a7-46b5-af95-0588daf9fa86" containerID="ce94e8a7ce0afa1b302b0a1993b5d90206c505bc6302ab5507859a6eab1dd7e0" exitCode=0 Dec 05 23:36:21 crc kubenswrapper[4734]: I1205 23:36:21.685177 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerDied","Data":"ce94e8a7ce0afa1b302b0a1993b5d90206c505bc6302ab5507859a6eab1dd7e0"} Dec 05 23:36:21 crc kubenswrapper[4734]: I1205 23:36:21.685219 4734 scope.go:117] "RemoveContainer" containerID="8e69331b125e1151d942b08cb111e9d9d1598a8f70aacd7d59fba49b1cd48af6" Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.419869 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.423277 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.429723 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.429736 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.429879 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.430211 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-8tvmp" Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.433696 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.536018 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350\") " pod="openstack/ovsdbserver-sb-0" Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.536081 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350\") " pod="openstack/ovsdbserver-sb-0" Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.536116 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350-config\") pod \"ovsdbserver-sb-0\" (UID: 
\"2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350\") " pod="openstack/ovsdbserver-sb-0" Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.536151 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350\") " pod="openstack/ovsdbserver-sb-0" Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.536223 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wbkw\" (UniqueName: \"kubernetes.io/projected/2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350-kube-api-access-2wbkw\") pod \"ovsdbserver-sb-0\" (UID: \"2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350\") " pod="openstack/ovsdbserver-sb-0" Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.536265 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350\") " pod="openstack/ovsdbserver-sb-0" Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.536282 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350\") " pod="openstack/ovsdbserver-sb-0" Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.536303 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350\") " 
pod="openstack/ovsdbserver-sb-0" Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.644751 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350\") " pod="openstack/ovsdbserver-sb-0" Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.644806 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350\") " pod="openstack/ovsdbserver-sb-0" Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.644826 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350-config\") pod \"ovsdbserver-sb-0\" (UID: \"2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350\") " pod="openstack/ovsdbserver-sb-0" Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.644852 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350\") " pod="openstack/ovsdbserver-sb-0" Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.644926 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wbkw\" (UniqueName: \"kubernetes.io/projected/2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350-kube-api-access-2wbkw\") pod \"ovsdbserver-sb-0\" (UID: \"2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350\") " pod="openstack/ovsdbserver-sb-0" Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.645337 4734 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-sb-0" Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.645794 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350\") " pod="openstack/ovsdbserver-sb-0" Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.646056 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350-config\") pod \"ovsdbserver-sb-0\" (UID: \"2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350\") " pod="openstack/ovsdbserver-sb-0" Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.645384 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350\") " pod="openstack/ovsdbserver-sb-0" Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.650027 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350\") " pod="openstack/ovsdbserver-sb-0" Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.650071 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350\") " 
pod="openstack/ovsdbserver-sb-0" Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.650939 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350\") " pod="openstack/ovsdbserver-sb-0" Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.651726 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350\") " pod="openstack/ovsdbserver-sb-0" Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.655035 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350\") " pod="openstack/ovsdbserver-sb-0" Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.659721 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350\") " pod="openstack/ovsdbserver-sb-0" Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.663121 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wbkw\" (UniqueName: \"kubernetes.io/projected/2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350-kube-api-access-2wbkw\") pod \"ovsdbserver-sb-0\" (UID: \"2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350\") " pod="openstack/ovsdbserver-sb-0" Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.669622 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350\") " pod="openstack/ovsdbserver-sb-0" Dec 05 23:36:23 crc kubenswrapper[4734]: I1205 23:36:23.763342 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 05 23:36:39 crc kubenswrapper[4734]: E1205 23:36:39.051661 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 05 23:36:39 crc kubenswrapper[4734]: E1205 23:36:39.052483 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rvnnn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(c35eaa12-d993-4769-975b-35a5ac6609e0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 23:36:39 crc 
kubenswrapper[4734]: E1205 23:36:39.053665 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="c35eaa12-d993-4769-975b-35a5ac6609e0" Dec 05 23:36:39 crc kubenswrapper[4734]: E1205 23:36:39.830095 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 05 23:36:39 crc kubenswrapper[4734]: E1205 23:36:39.830895 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2xddp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-fvnv4_openstack(a30f19aa-415d-4608-ac2a-7d52751225d7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 23:36:39 crc kubenswrapper[4734]: E1205 23:36:39.830643 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 05 23:36:39 crc kubenswrapper[4734]: E1205 23:36:39.831063 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ptd7t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,
SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-rb2ss_openstack(1826ecde-f68b-4010-bb28-aab705498c88): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 23:36:39 crc kubenswrapper[4734]: E1205 23:36:39.832213 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-rb2ss" podUID="1826ecde-f68b-4010-bb28-aab705498c88" Dec 05 23:36:39 crc kubenswrapper[4734]: E1205 23:36:39.832204 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-fvnv4" podUID="a30f19aa-415d-4608-ac2a-7d52751225d7" Dec 05 23:36:39 crc kubenswrapper[4734]: E1205 23:36:39.857955 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 05 23:36:39 crc kubenswrapper[4734]: E1205 23:36:39.858504 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-878pv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-f684s_openstack(355a394a-ee81-463e-8d82-b6c789ad6361): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 23:36:39 crc kubenswrapper[4734]: E1205 23:36:39.860298 4734 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-f684s" podUID="355a394a-ee81-463e-8d82-b6c789ad6361" Dec 05 23:36:39 crc kubenswrapper[4734]: E1205 23:36:39.866777 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 05 23:36:39 crc kubenswrapper[4734]: E1205 23:36:39.867002 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lbz4p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-zpj4b_openstack(96d53662-9e9d-4205-9a4b-23eea707f724): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 23:36:39 crc kubenswrapper[4734]: E1205 23:36:39.870698 4734 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-zpj4b" podUID="96d53662-9e9d-4205-9a4b-23eea707f724" Dec 05 23:36:39 crc kubenswrapper[4734]: E1205 23:36:39.903977 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-f684s" podUID="355a394a-ee81-463e-8d82-b6c789ad6361" Dec 05 23:36:39 crc kubenswrapper[4734]: E1205 23:36:39.904031 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-rb2ss" podUID="1826ecde-f68b-4010-bb28-aab705498c88" Dec 05 23:36:40 crc kubenswrapper[4734]: I1205 23:36:40.325404 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 05 23:36:40 crc kubenswrapper[4734]: W1205 23:36:40.332401 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61801e1d_6a79_497f_822b_69b683c2f78b.slice/crio-fdf863be8fc6425c77f1de0a3ff72247c129e11f2d46132521b8492db73b36d8 WatchSource:0}: Error finding container fdf863be8fc6425c77f1de0a3ff72247c129e11f2d46132521b8492db73b36d8: Status 404 returned error can't find the container with id fdf863be8fc6425c77f1de0a3ff72247c129e11f2d46132521b8492db73b36d8 Dec 05 23:36:40 crc kubenswrapper[4734]: I1205 23:36:40.557215 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 05 23:36:40 crc kubenswrapper[4734]: I1205 23:36:40.578402 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/kube-state-metrics-0"] Dec 05 23:36:40 crc kubenswrapper[4734]: W1205 23:36:40.609385 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fd725c7_f12a_4504_a71d_46e7d0258af7.slice/crio-c70bb13cee5df7baacb4ab421ba364103d79308731cc0977015e17db4e249e40 WatchSource:0}: Error finding container c70bb13cee5df7baacb4ab421ba364103d79308731cc0977015e17db4e249e40: Status 404 returned error can't find the container with id c70bb13cee5df7baacb4ab421ba364103d79308731cc0977015e17db4e249e40 Dec 05 23:36:40 crc kubenswrapper[4734]: I1205 23:36:40.668540 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-zpj4b" Dec 05 23:36:40 crc kubenswrapper[4734]: I1205 23:36:40.684479 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fvnv4" Dec 05 23:36:40 crc kubenswrapper[4734]: I1205 23:36:40.803571 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 23:36:40 crc kubenswrapper[4734]: W1205 23:36:40.810465 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3fceddd4_e096_4a7e_875f_756279962334.slice/crio-f2347b6946bf7fb3a29e444dcbf84d62ab7421266828f7f099e49a4bcb42a340 WatchSource:0}: Error finding container f2347b6946bf7fb3a29e444dcbf84d62ab7421266828f7f099e49a4bcb42a340: Status 404 returned error can't find the container with id f2347b6946bf7fb3a29e444dcbf84d62ab7421266828f7f099e49a4bcb42a340 Dec 05 23:36:40 crc kubenswrapper[4734]: I1205 23:36:40.824057 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a30f19aa-415d-4608-ac2a-7d52751225d7-config\") pod \"a30f19aa-415d-4608-ac2a-7d52751225d7\" (UID: \"a30f19aa-415d-4608-ac2a-7d52751225d7\") " Dec 05 
23:36:40 crc kubenswrapper[4734]: I1205 23:36:40.824121 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96d53662-9e9d-4205-9a4b-23eea707f724-dns-svc\") pod \"96d53662-9e9d-4205-9a4b-23eea707f724\" (UID: \"96d53662-9e9d-4205-9a4b-23eea707f724\") " Dec 05 23:36:40 crc kubenswrapper[4734]: I1205 23:36:40.824206 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xddp\" (UniqueName: \"kubernetes.io/projected/a30f19aa-415d-4608-ac2a-7d52751225d7-kube-api-access-2xddp\") pod \"a30f19aa-415d-4608-ac2a-7d52751225d7\" (UID: \"a30f19aa-415d-4608-ac2a-7d52751225d7\") " Dec 05 23:36:40 crc kubenswrapper[4734]: I1205 23:36:40.824302 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbz4p\" (UniqueName: \"kubernetes.io/projected/96d53662-9e9d-4205-9a4b-23eea707f724-kube-api-access-lbz4p\") pod \"96d53662-9e9d-4205-9a4b-23eea707f724\" (UID: \"96d53662-9e9d-4205-9a4b-23eea707f724\") " Dec 05 23:36:40 crc kubenswrapper[4734]: I1205 23:36:40.824970 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a30f19aa-415d-4608-ac2a-7d52751225d7-config" (OuterVolumeSpecName: "config") pod "a30f19aa-415d-4608-ac2a-7d52751225d7" (UID: "a30f19aa-415d-4608-ac2a-7d52751225d7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:36:40 crc kubenswrapper[4734]: I1205 23:36:40.826125 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96d53662-9e9d-4205-9a4b-23eea707f724-config\") pod \"96d53662-9e9d-4205-9a4b-23eea707f724\" (UID: \"96d53662-9e9d-4205-9a4b-23eea707f724\") " Dec 05 23:36:40 crc kubenswrapper[4734]: I1205 23:36:40.826633 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a30f19aa-415d-4608-ac2a-7d52751225d7-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:36:40 crc kubenswrapper[4734]: I1205 23:36:40.826624 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96d53662-9e9d-4205-9a4b-23eea707f724-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "96d53662-9e9d-4205-9a4b-23eea707f724" (UID: "96d53662-9e9d-4205-9a4b-23eea707f724"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:36:40 crc kubenswrapper[4734]: I1205 23:36:40.826676 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96d53662-9e9d-4205-9a4b-23eea707f724-config" (OuterVolumeSpecName: "config") pod "96d53662-9e9d-4205-9a4b-23eea707f724" (UID: "96d53662-9e9d-4205-9a4b-23eea707f724"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:36:40 crc kubenswrapper[4734]: I1205 23:36:40.836238 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a30f19aa-415d-4608-ac2a-7d52751225d7-kube-api-access-2xddp" (OuterVolumeSpecName: "kube-api-access-2xddp") pod "a30f19aa-415d-4608-ac2a-7d52751225d7" (UID: "a30f19aa-415d-4608-ac2a-7d52751225d7"). InnerVolumeSpecName "kube-api-access-2xddp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:36:40 crc kubenswrapper[4734]: I1205 23:36:40.836399 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96d53662-9e9d-4205-9a4b-23eea707f724-kube-api-access-lbz4p" (OuterVolumeSpecName: "kube-api-access-lbz4p") pod "96d53662-9e9d-4205-9a4b-23eea707f724" (UID: "96d53662-9e9d-4205-9a4b-23eea707f724"). InnerVolumeSpecName "kube-api-access-lbz4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:36:40 crc kubenswrapper[4734]: I1205 23:36:40.864504 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-587wk"] Dec 05 23:36:40 crc kubenswrapper[4734]: I1205 23:36:40.890398 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 05 23:36:40 crc kubenswrapper[4734]: I1205 23:36:40.903114 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fvnv4" Dec 05 23:36:40 crc kubenswrapper[4734]: I1205 23:36:40.903268 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-fvnv4" event={"ID":"a30f19aa-415d-4608-ac2a-7d52751225d7","Type":"ContainerDied","Data":"b89b92adec2f93d31ca261d54c0fd9e431dca8b8e749b4371bb076223e2d87ce"} Dec 05 23:36:40 crc kubenswrapper[4734]: I1205 23:36:40.904748 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4cf3a204-9b47-4206-9964-deb892777324","Type":"ContainerStarted","Data":"2c9738390b425ae10b5dae6687a99e02ab70086f7e3cfae786cd759a5d285b40"} Dec 05 23:36:40 crc kubenswrapper[4734]: I1205 23:36:40.907215 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-587wk" event={"ID":"625f2253-5867-4d61-a436-264a79c0bd94","Type":"ContainerStarted","Data":"cce039d6ccd1b37f8138878cc6150c9ff2c11951b3b5d05a226f5abb00f01659"} Dec 05 23:36:40 crc kubenswrapper[4734]: I1205 
23:36:40.908307 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-zpj4b" event={"ID":"96d53662-9e9d-4205-9a4b-23eea707f724","Type":"ContainerDied","Data":"4cfa82277d99d4ded5415167d1ca5acad1682b9d6a1ae46bb6f755b939e35c30"} Dec 05 23:36:40 crc kubenswrapper[4734]: I1205 23:36:40.908340 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-zpj4b" Dec 05 23:36:40 crc kubenswrapper[4734]: I1205 23:36:40.909662 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9fd725c7-f12a-4504-a71d-46e7d0258af7","Type":"ContainerStarted","Data":"c70bb13cee5df7baacb4ab421ba364103d79308731cc0977015e17db4e249e40"} Dec 05 23:36:40 crc kubenswrapper[4734]: I1205 23:36:40.913959 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"61801e1d-6a79-497f-822b-69b683c2f78b","Type":"ContainerStarted","Data":"fdf863be8fc6425c77f1de0a3ff72247c129e11f2d46132521b8492db73b36d8"} Dec 05 23:36:40 crc kubenswrapper[4734]: I1205 23:36:40.916362 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerStarted","Data":"5119dd9005e526fae1b15071e6d704440bd8834afc5ec6ce50aaa9f27c74ff90"} Dec 05 23:36:40 crc kubenswrapper[4734]: I1205 23:36:40.917470 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3fceddd4-e096-4a7e-875f-756279962334","Type":"ContainerStarted","Data":"f2347b6946bf7fb3a29e444dcbf84d62ab7421266828f7f099e49a4bcb42a340"} Dec 05 23:36:40 crc kubenswrapper[4734]: I1205 23:36:40.927999 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xddp\" (UniqueName: \"kubernetes.io/projected/a30f19aa-415d-4608-ac2a-7d52751225d7-kube-api-access-2xddp\") on node \"crc\" DevicePath \"\"" Dec 05 23:36:40 crc 
kubenswrapper[4734]: I1205 23:36:40.928051 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbz4p\" (UniqueName: \"kubernetes.io/projected/96d53662-9e9d-4205-9a4b-23eea707f724-kube-api-access-lbz4p\") on node \"crc\" DevicePath \"\"" Dec 05 23:36:40 crc kubenswrapper[4734]: I1205 23:36:40.928061 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96d53662-9e9d-4205-9a4b-23eea707f724-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:36:40 crc kubenswrapper[4734]: I1205 23:36:40.928073 4734 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96d53662-9e9d-4205-9a4b-23eea707f724-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 23:36:40 crc kubenswrapper[4734]: I1205 23:36:40.989326 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fvnv4"] Dec 05 23:36:41 crc kubenswrapper[4734]: I1205 23:36:41.013131 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fvnv4"] Dec 05 23:36:41 crc kubenswrapper[4734]: I1205 23:36:41.047250 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-tpdrq"] Dec 05 23:36:41 crc kubenswrapper[4734]: I1205 23:36:41.073210 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zpj4b"] Dec 05 23:36:41 crc kubenswrapper[4734]: I1205 23:36:41.082570 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zpj4b"] Dec 05 23:36:41 crc kubenswrapper[4734]: I1205 23:36:41.505477 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 23:36:41 crc kubenswrapper[4734]: W1205 23:36:41.582110 4734 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f3dcdbf_2c38_4e2a_9420_c2d7f9b75350.slice/crio-cb6c75a7fbeda3d2b66a689dd5d52f1915e2ada4e5342374f9151a12eeb8e979 WatchSource:0}: Error finding container cb6c75a7fbeda3d2b66a689dd5d52f1915e2ada4e5342374f9151a12eeb8e979: Status 404 returned error can't find the container with id cb6c75a7fbeda3d2b66a689dd5d52f1915e2ada4e5342374f9151a12eeb8e979 Dec 05 23:36:41 crc kubenswrapper[4734]: I1205 23:36:41.630057 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96d53662-9e9d-4205-9a4b-23eea707f724" path="/var/lib/kubelet/pods/96d53662-9e9d-4205-9a4b-23eea707f724/volumes" Dec 05 23:36:41 crc kubenswrapper[4734]: I1205 23:36:41.630741 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a30f19aa-415d-4608-ac2a-7d52751225d7" path="/var/lib/kubelet/pods/a30f19aa-415d-4608-ac2a-7d52751225d7/volumes" Dec 05 23:36:41 crc kubenswrapper[4734]: I1205 23:36:41.940714 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tpdrq" event={"ID":"9631bcf5-05df-4e1d-b849-7352ef35013f","Type":"ContainerStarted","Data":"585da5b1e92eebe1c81908059931f1d9db8325ac1c3cebe0d5aea78f87660697"} Dec 05 23:36:41 crc kubenswrapper[4734]: I1205 23:36:41.944101 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ed95027c-1ded-4127-a341-7ee81018d4b6","Type":"ContainerStarted","Data":"66e5a249cf9e8b0a22292ba791d1aa360ef84159f879636f2af4ddcac64c1e31"} Dec 05 23:36:41 crc kubenswrapper[4734]: I1205 23:36:41.946987 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"3cc9e4dc-431f-4963-911b-f6262ac3c6b5","Type":"ContainerStarted","Data":"092726cffdbde04f05517ef6759fcb458ef62d921aacb364cd3bf7435073bd18"} Dec 05 23:36:41 crc kubenswrapper[4734]: I1205 23:36:41.949748 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350","Type":"ContainerStarted","Data":"cb6c75a7fbeda3d2b66a689dd5d52f1915e2ada4e5342374f9151a12eeb8e979"} Dec 05 23:36:41 crc kubenswrapper[4734]: I1205 23:36:41.954755 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c35eaa12-d993-4769-975b-35a5ac6609e0","Type":"ContainerStarted","Data":"c5950e0647bc09668aa5bb962f9d7316a20f3b0c1e9e76366f98c21ef1804c9e"} Dec 05 23:36:43 crc kubenswrapper[4734]: I1205 23:36:43.955466 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-cpzs4"] Dec 05 23:36:43 crc kubenswrapper[4734]: I1205 23:36:43.957625 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-cpzs4" Dec 05 23:36:43 crc kubenswrapper[4734]: I1205 23:36:43.960726 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 05 23:36:43 crc kubenswrapper[4734]: I1205 23:36:43.969646 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-cpzs4"] Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.097490 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b246fed6-9a79-4d72-a73a-943b13d8e30b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-cpzs4\" (UID: \"b246fed6-9a79-4d72-a73a-943b13d8e30b\") " pod="openstack/ovn-controller-metrics-cpzs4" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.097565 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b246fed6-9a79-4d72-a73a-943b13d8e30b-ovn-rundir\") pod \"ovn-controller-metrics-cpzs4\" (UID: \"b246fed6-9a79-4d72-a73a-943b13d8e30b\") " pod="openstack/ovn-controller-metrics-cpzs4" Dec 05 23:36:44 crc 
kubenswrapper[4734]: I1205 23:36:44.097608 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b246fed6-9a79-4d72-a73a-943b13d8e30b-ovs-rundir\") pod \"ovn-controller-metrics-cpzs4\" (UID: \"b246fed6-9a79-4d72-a73a-943b13d8e30b\") " pod="openstack/ovn-controller-metrics-cpzs4" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.097753 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b246fed6-9a79-4d72-a73a-943b13d8e30b-combined-ca-bundle\") pod \"ovn-controller-metrics-cpzs4\" (UID: \"b246fed6-9a79-4d72-a73a-943b13d8e30b\") " pod="openstack/ovn-controller-metrics-cpzs4" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.097939 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b246fed6-9a79-4d72-a73a-943b13d8e30b-config\") pod \"ovn-controller-metrics-cpzs4\" (UID: \"b246fed6-9a79-4d72-a73a-943b13d8e30b\") " pod="openstack/ovn-controller-metrics-cpzs4" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.098174 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8kkw\" (UniqueName: \"kubernetes.io/projected/b246fed6-9a79-4d72-a73a-943b13d8e30b-kube-api-access-r8kkw\") pod \"ovn-controller-metrics-cpzs4\" (UID: \"b246fed6-9a79-4d72-a73a-943b13d8e30b\") " pod="openstack/ovn-controller-metrics-cpzs4" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.131156 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-f684s"] Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.202074 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/b246fed6-9a79-4d72-a73a-943b13d8e30b-ovs-rundir\") pod \"ovn-controller-metrics-cpzs4\" (UID: \"b246fed6-9a79-4d72-a73a-943b13d8e30b\") " pod="openstack/ovn-controller-metrics-cpzs4" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.202449 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b246fed6-9a79-4d72-a73a-943b13d8e30b-combined-ca-bundle\") pod \"ovn-controller-metrics-cpzs4\" (UID: \"b246fed6-9a79-4d72-a73a-943b13d8e30b\") " pod="openstack/ovn-controller-metrics-cpzs4" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.202635 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b246fed6-9a79-4d72-a73a-943b13d8e30b-config\") pod \"ovn-controller-metrics-cpzs4\" (UID: \"b246fed6-9a79-4d72-a73a-943b13d8e30b\") " pod="openstack/ovn-controller-metrics-cpzs4" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.202763 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8kkw\" (UniqueName: \"kubernetes.io/projected/b246fed6-9a79-4d72-a73a-943b13d8e30b-kube-api-access-r8kkw\") pod \"ovn-controller-metrics-cpzs4\" (UID: \"b246fed6-9a79-4d72-a73a-943b13d8e30b\") " pod="openstack/ovn-controller-metrics-cpzs4" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.202878 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b246fed6-9a79-4d72-a73a-943b13d8e30b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-cpzs4\" (UID: \"b246fed6-9a79-4d72-a73a-943b13d8e30b\") " pod="openstack/ovn-controller-metrics-cpzs4" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.203004 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/host-path/b246fed6-9a79-4d72-a73a-943b13d8e30b-ovn-rundir\") pod \"ovn-controller-metrics-cpzs4\" (UID: \"b246fed6-9a79-4d72-a73a-943b13d8e30b\") " pod="openstack/ovn-controller-metrics-cpzs4" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.205131 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b246fed6-9a79-4d72-a73a-943b13d8e30b-ovn-rundir\") pod \"ovn-controller-metrics-cpzs4\" (UID: \"b246fed6-9a79-4d72-a73a-943b13d8e30b\") " pod="openstack/ovn-controller-metrics-cpzs4" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.205478 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b246fed6-9a79-4d72-a73a-943b13d8e30b-ovs-rundir\") pod \"ovn-controller-metrics-cpzs4\" (UID: \"b246fed6-9a79-4d72-a73a-943b13d8e30b\") " pod="openstack/ovn-controller-metrics-cpzs4" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.213340 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b246fed6-9a79-4d72-a73a-943b13d8e30b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-cpzs4\" (UID: \"b246fed6-9a79-4d72-a73a-943b13d8e30b\") " pod="openstack/ovn-controller-metrics-cpzs4" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.213464 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b246fed6-9a79-4d72-a73a-943b13d8e30b-combined-ca-bundle\") pod \"ovn-controller-metrics-cpzs4\" (UID: \"b246fed6-9a79-4d72-a73a-943b13d8e30b\") " pod="openstack/ovn-controller-metrics-cpzs4" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.218839 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b246fed6-9a79-4d72-a73a-943b13d8e30b-config\") pod 
\"ovn-controller-metrics-cpzs4\" (UID: \"b246fed6-9a79-4d72-a73a-943b13d8e30b\") " pod="openstack/ovn-controller-metrics-cpzs4" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.237185 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-t4slg"] Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.237898 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8kkw\" (UniqueName: \"kubernetes.io/projected/b246fed6-9a79-4d72-a73a-943b13d8e30b-kube-api-access-r8kkw\") pod \"ovn-controller-metrics-cpzs4\" (UID: \"b246fed6-9a79-4d72-a73a-943b13d8e30b\") " pod="openstack/ovn-controller-metrics-cpzs4" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.245062 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-t4slg" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.264243 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-t4slg"] Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.276623 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.304699 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5bj2\" (UniqueName: \"kubernetes.io/projected/b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7-kube-api-access-r5bj2\") pod \"dnsmasq-dns-7fd796d7df-t4slg\" (UID: \"b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7\") " pod="openstack/dnsmasq-dns-7fd796d7df-t4slg" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.304762 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-t4slg\" (UID: \"b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7\") " 
pod="openstack/dnsmasq-dns-7fd796d7df-t4slg" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.304794 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7-config\") pod \"dnsmasq-dns-7fd796d7df-t4slg\" (UID: \"b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7\") " pod="openstack/dnsmasq-dns-7fd796d7df-t4slg" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.304832 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-t4slg\" (UID: \"b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7\") " pod="openstack/dnsmasq-dns-7fd796d7df-t4slg" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.324099 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-cpzs4" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.395106 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-rb2ss"] Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.407729 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5bj2\" (UniqueName: \"kubernetes.io/projected/b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7-kube-api-access-r5bj2\") pod \"dnsmasq-dns-7fd796d7df-t4slg\" (UID: \"b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7\") " pod="openstack/dnsmasq-dns-7fd796d7df-t4slg" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.409466 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-t4slg\" (UID: \"b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7\") " pod="openstack/dnsmasq-dns-7fd796d7df-t4slg" Dec 05 23:36:44 crc 
kubenswrapper[4734]: I1205 23:36:44.417675 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-t4slg\" (UID: \"b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7\") " pod="openstack/dnsmasq-dns-7fd796d7df-t4slg" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.417884 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7-config\") pod \"dnsmasq-dns-7fd796d7df-t4slg\" (UID: \"b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7\") " pod="openstack/dnsmasq-dns-7fd796d7df-t4slg" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.418067 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-t4slg\" (UID: \"b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7\") " pod="openstack/dnsmasq-dns-7fd796d7df-t4slg" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.419179 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7-config\") pod \"dnsmasq-dns-7fd796d7df-t4slg\" (UID: \"b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7\") " pod="openstack/dnsmasq-dns-7fd796d7df-t4slg" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.419196 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-c9btw"] Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.421722 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-c9btw" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.424356 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-t4slg\" (UID: \"b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7\") " pod="openstack/dnsmasq-dns-7fd796d7df-t4slg" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.451254 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.462883 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5bj2\" (UniqueName: \"kubernetes.io/projected/b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7-kube-api-access-r5bj2\") pod \"dnsmasq-dns-7fd796d7df-t4slg\" (UID: \"b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7\") " pod="openstack/dnsmasq-dns-7fd796d7df-t4slg" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.465162 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-c9btw"] Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.520331 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn9bj\" (UniqueName: \"kubernetes.io/projected/115aa742-8de4-4cb2-84e2-c05f698eda5e-kube-api-access-vn9bj\") pod \"dnsmasq-dns-86db49b7ff-c9btw\" (UID: \"115aa742-8de4-4cb2-84e2-c05f698eda5e\") " pod="openstack/dnsmasq-dns-86db49b7ff-c9btw" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.520400 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/115aa742-8de4-4cb2-84e2-c05f698eda5e-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-c9btw\" (UID: \"115aa742-8de4-4cb2-84e2-c05f698eda5e\") " pod="openstack/dnsmasq-dns-86db49b7ff-c9btw" Dec 05 
23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.520445 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/115aa742-8de4-4cb2-84e2-c05f698eda5e-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-c9btw\" (UID: \"115aa742-8de4-4cb2-84e2-c05f698eda5e\") " pod="openstack/dnsmasq-dns-86db49b7ff-c9btw" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.520483 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/115aa742-8de4-4cb2-84e2-c05f698eda5e-config\") pod \"dnsmasq-dns-86db49b7ff-c9btw\" (UID: \"115aa742-8de4-4cb2-84e2-c05f698eda5e\") " pod="openstack/dnsmasq-dns-86db49b7ff-c9btw" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.520579 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/115aa742-8de4-4cb2-84e2-c05f698eda5e-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-c9btw\" (UID: \"115aa742-8de4-4cb2-84e2-c05f698eda5e\") " pod="openstack/dnsmasq-dns-86db49b7ff-c9btw" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.622065 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn9bj\" (UniqueName: \"kubernetes.io/projected/115aa742-8de4-4cb2-84e2-c05f698eda5e-kube-api-access-vn9bj\") pod \"dnsmasq-dns-86db49b7ff-c9btw\" (UID: \"115aa742-8de4-4cb2-84e2-c05f698eda5e\") " pod="openstack/dnsmasq-dns-86db49b7ff-c9btw" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.622148 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/115aa742-8de4-4cb2-84e2-c05f698eda5e-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-c9btw\" (UID: \"115aa742-8de4-4cb2-84e2-c05f698eda5e\") " pod="openstack/dnsmasq-dns-86db49b7ff-c9btw" Dec 05 23:36:44 crc 
kubenswrapper[4734]: I1205 23:36:44.622180 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/115aa742-8de4-4cb2-84e2-c05f698eda5e-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-c9btw\" (UID: \"115aa742-8de4-4cb2-84e2-c05f698eda5e\") " pod="openstack/dnsmasq-dns-86db49b7ff-c9btw" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.622209 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/115aa742-8de4-4cb2-84e2-c05f698eda5e-config\") pod \"dnsmasq-dns-86db49b7ff-c9btw\" (UID: \"115aa742-8de4-4cb2-84e2-c05f698eda5e\") " pod="openstack/dnsmasq-dns-86db49b7ff-c9btw" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.622284 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/115aa742-8de4-4cb2-84e2-c05f698eda5e-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-c9btw\" (UID: \"115aa742-8de4-4cb2-84e2-c05f698eda5e\") " pod="openstack/dnsmasq-dns-86db49b7ff-c9btw" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.623606 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/115aa742-8de4-4cb2-84e2-c05f698eda5e-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-c9btw\" (UID: \"115aa742-8de4-4cb2-84e2-c05f698eda5e\") " pod="openstack/dnsmasq-dns-86db49b7ff-c9btw" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.623792 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/115aa742-8de4-4cb2-84e2-c05f698eda5e-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-c9btw\" (UID: \"115aa742-8de4-4cb2-84e2-c05f698eda5e\") " pod="openstack/dnsmasq-dns-86db49b7ff-c9btw" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.624752 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/115aa742-8de4-4cb2-84e2-c05f698eda5e-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-c9btw\" (UID: \"115aa742-8de4-4cb2-84e2-c05f698eda5e\") " pod="openstack/dnsmasq-dns-86db49b7ff-c9btw" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.630484 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/115aa742-8de4-4cb2-84e2-c05f698eda5e-config\") pod \"dnsmasq-dns-86db49b7ff-c9btw\" (UID: \"115aa742-8de4-4cb2-84e2-c05f698eda5e\") " pod="openstack/dnsmasq-dns-86db49b7ff-c9btw" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.632791 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-t4slg" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.655271 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn9bj\" (UniqueName: \"kubernetes.io/projected/115aa742-8de4-4cb2-84e2-c05f698eda5e-kube-api-access-vn9bj\") pod \"dnsmasq-dns-86db49b7ff-c9btw\" (UID: \"115aa742-8de4-4cb2-84e2-c05f698eda5e\") " pod="openstack/dnsmasq-dns-86db49b7ff-c9btw" Dec 05 23:36:44 crc kubenswrapper[4734]: I1205 23:36:44.830460 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-c9btw" Dec 05 23:36:48 crc kubenswrapper[4734]: I1205 23:36:48.243701 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-rb2ss" Dec 05 23:36:48 crc kubenswrapper[4734]: I1205 23:36:48.253115 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-f684s" Dec 05 23:36:48 crc kubenswrapper[4734]: I1205 23:36:48.298672 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1826ecde-f68b-4010-bb28-aab705498c88-config\") pod \"1826ecde-f68b-4010-bb28-aab705498c88\" (UID: \"1826ecde-f68b-4010-bb28-aab705498c88\") " Dec 05 23:36:48 crc kubenswrapper[4734]: I1205 23:36:48.299548 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/355a394a-ee81-463e-8d82-b6c789ad6361-dns-svc\") pod \"355a394a-ee81-463e-8d82-b6c789ad6361\" (UID: \"355a394a-ee81-463e-8d82-b6c789ad6361\") " Dec 05 23:36:48 crc kubenswrapper[4734]: I1205 23:36:48.299580 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1826ecde-f68b-4010-bb28-aab705498c88-config" (OuterVolumeSpecName: "config") pod "1826ecde-f68b-4010-bb28-aab705498c88" (UID: "1826ecde-f68b-4010-bb28-aab705498c88"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:36:48 crc kubenswrapper[4734]: I1205 23:36:48.299939 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptd7t\" (UniqueName: \"kubernetes.io/projected/1826ecde-f68b-4010-bb28-aab705498c88-kube-api-access-ptd7t\") pod \"1826ecde-f68b-4010-bb28-aab705498c88\" (UID: \"1826ecde-f68b-4010-bb28-aab705498c88\") " Dec 05 23:36:48 crc kubenswrapper[4734]: I1205 23:36:48.299999 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1826ecde-f68b-4010-bb28-aab705498c88-dns-svc\") pod \"1826ecde-f68b-4010-bb28-aab705498c88\" (UID: \"1826ecde-f68b-4010-bb28-aab705498c88\") " Dec 05 23:36:48 crc kubenswrapper[4734]: I1205 23:36:48.300044 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-878pv\" (UniqueName: \"kubernetes.io/projected/355a394a-ee81-463e-8d82-b6c789ad6361-kube-api-access-878pv\") pod \"355a394a-ee81-463e-8d82-b6c789ad6361\" (UID: \"355a394a-ee81-463e-8d82-b6c789ad6361\") " Dec 05 23:36:48 crc kubenswrapper[4734]: I1205 23:36:48.300152 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/355a394a-ee81-463e-8d82-b6c789ad6361-config\") pod \"355a394a-ee81-463e-8d82-b6c789ad6361\" (UID: \"355a394a-ee81-463e-8d82-b6c789ad6361\") " Dec 05 23:36:48 crc kubenswrapper[4734]: I1205 23:36:48.300423 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/355a394a-ee81-463e-8d82-b6c789ad6361-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "355a394a-ee81-463e-8d82-b6c789ad6361" (UID: "355a394a-ee81-463e-8d82-b6c789ad6361"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:36:48 crc kubenswrapper[4734]: I1205 23:36:48.300728 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1826ecde-f68b-4010-bb28-aab705498c88-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1826ecde-f68b-4010-bb28-aab705498c88" (UID: "1826ecde-f68b-4010-bb28-aab705498c88"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:36:48 crc kubenswrapper[4734]: I1205 23:36:48.301006 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1826ecde-f68b-4010-bb28-aab705498c88-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:36:48 crc kubenswrapper[4734]: I1205 23:36:48.301043 4734 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/355a394a-ee81-463e-8d82-b6c789ad6361-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 23:36:48 crc kubenswrapper[4734]: I1205 23:36:48.301210 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/355a394a-ee81-463e-8d82-b6c789ad6361-config" (OuterVolumeSpecName: "config") pod "355a394a-ee81-463e-8d82-b6c789ad6361" (UID: "355a394a-ee81-463e-8d82-b6c789ad6361"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:36:48 crc kubenswrapper[4734]: I1205 23:36:48.311862 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1826ecde-f68b-4010-bb28-aab705498c88-kube-api-access-ptd7t" (OuterVolumeSpecName: "kube-api-access-ptd7t") pod "1826ecde-f68b-4010-bb28-aab705498c88" (UID: "1826ecde-f68b-4010-bb28-aab705498c88"). InnerVolumeSpecName "kube-api-access-ptd7t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:36:48 crc kubenswrapper[4734]: I1205 23:36:48.312867 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/355a394a-ee81-463e-8d82-b6c789ad6361-kube-api-access-878pv" (OuterVolumeSpecName: "kube-api-access-878pv") pod "355a394a-ee81-463e-8d82-b6c789ad6361" (UID: "355a394a-ee81-463e-8d82-b6c789ad6361"). InnerVolumeSpecName "kube-api-access-878pv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:36:48 crc kubenswrapper[4734]: I1205 23:36:48.403413 4734 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1826ecde-f68b-4010-bb28-aab705498c88-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 23:36:48 crc kubenswrapper[4734]: I1205 23:36:48.403476 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-878pv\" (UniqueName: \"kubernetes.io/projected/355a394a-ee81-463e-8d82-b6c789ad6361-kube-api-access-878pv\") on node \"crc\" DevicePath \"\"" Dec 05 23:36:48 crc kubenswrapper[4734]: I1205 23:36:48.403495 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/355a394a-ee81-463e-8d82-b6c789ad6361-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:36:48 crc kubenswrapper[4734]: I1205 23:36:48.403508 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptd7t\" (UniqueName: \"kubernetes.io/projected/1826ecde-f68b-4010-bb28-aab705498c88-kube-api-access-ptd7t\") on node \"crc\" DevicePath \"\"" Dec 05 23:36:49 crc kubenswrapper[4734]: I1205 23:36:49.061497 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-rb2ss" event={"ID":"1826ecde-f68b-4010-bb28-aab705498c88","Type":"ContainerDied","Data":"c8011d7672c5f37d2040665f0ae00fd061580906bebb4e189a1635ea5c0ee67c"} Dec 05 23:36:49 crc kubenswrapper[4734]: I1205 23:36:49.062192 4734 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-rb2ss" Dec 05 23:36:49 crc kubenswrapper[4734]: I1205 23:36:49.064283 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-f684s" event={"ID":"355a394a-ee81-463e-8d82-b6c789ad6361","Type":"ContainerDied","Data":"9e01e898ad702db21ef593d30af98eb1d8ca97f049e0cf2fc8903d94851549bf"} Dec 05 23:36:49 crc kubenswrapper[4734]: I1205 23:36:49.064406 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-f684s" Dec 05 23:36:49 crc kubenswrapper[4734]: I1205 23:36:49.158655 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-rb2ss"] Dec 05 23:36:49 crc kubenswrapper[4734]: I1205 23:36:49.170822 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-rb2ss"] Dec 05 23:36:49 crc kubenswrapper[4734]: I1205 23:36:49.196077 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-f684s"] Dec 05 23:36:49 crc kubenswrapper[4734]: I1205 23:36:49.203309 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-f684s"] Dec 05 23:36:49 crc kubenswrapper[4734]: I1205 23:36:49.631086 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1826ecde-f68b-4010-bb28-aab705498c88" path="/var/lib/kubelet/pods/1826ecde-f68b-4010-bb28-aab705498c88/volumes" Dec 05 23:36:49 crc kubenswrapper[4734]: I1205 23:36:49.632992 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="355a394a-ee81-463e-8d82-b6c789ad6361" path="/var/lib/kubelet/pods/355a394a-ee81-463e-8d82-b6c789ad6361/volumes" Dec 05 23:36:49 crc kubenswrapper[4734]: I1205 23:36:49.888664 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-c9btw"] Dec 05 23:36:50 crc kubenswrapper[4734]: I1205 23:36:50.077251 4734 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-cpzs4"] Dec 05 23:36:50 crc kubenswrapper[4734]: I1205 23:36:50.085235 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-t4slg"] Dec 05 23:36:51 crc kubenswrapper[4734]: I1205 23:36:51.085766 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"61801e1d-6a79-497f-822b-69b683c2f78b","Type":"ContainerStarted","Data":"a485b0bcb9d08c6a1245e7a31ef8fe22d5e95b98c41f62686520c3fbc7c7ef60"} Dec 05 23:36:51 crc kubenswrapper[4734]: I1205 23:36:51.086511 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 05 23:36:51 crc kubenswrapper[4734]: I1205 23:36:51.088062 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-cpzs4" event={"ID":"b246fed6-9a79-4d72-a73a-943b13d8e30b","Type":"ContainerStarted","Data":"edead927094adb1c9468f07f3845235c7f5d011bd7513f13ece5fe92870ef3d1"} Dec 05 23:36:51 crc kubenswrapper[4734]: I1205 23:36:51.097505 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tpdrq" event={"ID":"9631bcf5-05df-4e1d-b849-7352ef35013f","Type":"ContainerStarted","Data":"2085a25019b7ea2459def9a0c4b7da64c8e9cd6a1e18f44e7a12a0c4e9cc7e96"} Dec 05 23:36:51 crc kubenswrapper[4734]: I1205 23:36:51.102615 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4cf3a204-9b47-4206-9964-deb892777324","Type":"ContainerStarted","Data":"a50fc2d887cea8301fb9cb505ce53e804ed8f68a83f980a2b3e0fd30af384a43"} Dec 05 23:36:51 crc kubenswrapper[4734]: I1205 23:36:51.103346 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 05 23:36:51 crc kubenswrapper[4734]: I1205 23:36:51.107298 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-587wk" 
event={"ID":"625f2253-5867-4d61-a436-264a79c0bd94","Type":"ContainerStarted","Data":"48d59a6a345789623b51779c820f13d025f7b3daf96c88f9b78dbb248e96d3b3"} Dec 05 23:36:51 crc kubenswrapper[4734]: I1205 23:36:51.108137 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-587wk" Dec 05 23:36:51 crc kubenswrapper[4734]: I1205 23:36:51.114772 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=28.682796611 podStartE2EDuration="37.114752188s" podCreationTimestamp="2025-12-05 23:36:14 +0000 UTC" firstStartedPulling="2025-12-05 23:36:40.334815655 +0000 UTC m=+1021.018219931" lastFinishedPulling="2025-12-05 23:36:48.766771232 +0000 UTC m=+1029.450175508" observedRunningTime="2025-12-05 23:36:51.106719674 +0000 UTC m=+1031.790123950" watchObservedRunningTime="2025-12-05 23:36:51.114752188 +0000 UTC m=+1031.798156464" Dec 05 23:36:51 crc kubenswrapper[4734]: I1205 23:36:51.120444 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"3cc9e4dc-431f-4963-911b-f6262ac3c6b5","Type":"ContainerStarted","Data":"615ea66682ac3f7589f9927ac9524029ae6312075c9d6b01ab7e6a8c79505de4"} Dec 05 23:36:51 crc kubenswrapper[4734]: I1205 23:36:51.130903 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350","Type":"ContainerStarted","Data":"eb00da9d05923308d98461b940a2a0a18d8b558db69d98903ef94ccb6c7e35d4"} Dec 05 23:36:51 crc kubenswrapper[4734]: I1205 23:36:51.131242 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=26.205674737 podStartE2EDuration="36.131211747s" podCreationTimestamp="2025-12-05 23:36:15 +0000 UTC" firstStartedPulling="2025-12-05 23:36:40.614822457 +0000 UTC m=+1021.298226733" lastFinishedPulling="2025-12-05 23:36:50.540359467 +0000 UTC 
m=+1031.223763743" observedRunningTime="2025-12-05 23:36:51.124882374 +0000 UTC m=+1031.808286650" watchObservedRunningTime="2025-12-05 23:36:51.131211747 +0000 UTC m=+1031.814616023" Dec 05 23:36:51 crc kubenswrapper[4734]: I1205 23:36:51.135966 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3fceddd4-e096-4a7e-875f-756279962334","Type":"ContainerStarted","Data":"ebddb6aea37aa74eca328aa8fa57153a39aeaf39894bd2bf386bf9a21e0dac24"} Dec 05 23:36:51 crc kubenswrapper[4734]: I1205 23:36:51.139059 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9fd725c7-f12a-4504-a71d-46e7d0258af7","Type":"ContainerStarted","Data":"7f71935864e7539f13401a2b23fa03a0fc61bd480149c9381b84a292043162a7"} Dec 05 23:36:51 crc kubenswrapper[4734]: I1205 23:36:51.140942 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-c9btw" event={"ID":"115aa742-8de4-4cb2-84e2-c05f698eda5e","Type":"ContainerStarted","Data":"46761160ea59dc9ae2cd9b52e44fc6da30478bf985ed4e0b77110af8b99bf010"} Dec 05 23:36:51 crc kubenswrapper[4734]: I1205 23:36:51.141707 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-t4slg" event={"ID":"b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7","Type":"ContainerStarted","Data":"873a62bc6a31389e6ec8e0f3fe070b5e8480cc0d91709ba8e9e83f0c27d78b84"} Dec 05 23:36:51 crc kubenswrapper[4734]: I1205 23:36:51.233366 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-587wk" podStartSLOduration=21.685649541 podStartE2EDuration="31.23334375s" podCreationTimestamp="2025-12-05 23:36:20 +0000 UTC" firstStartedPulling="2025-12-05 23:36:40.861114892 +0000 UTC m=+1021.544519168" lastFinishedPulling="2025-12-05 23:36:50.408809101 +0000 UTC m=+1031.092213377" observedRunningTime="2025-12-05 23:36:51.226631798 +0000 UTC m=+1031.910036074" watchObservedRunningTime="2025-12-05 
23:36:51.23334375 +0000 UTC m=+1031.916748036" Dec 05 23:36:52 crc kubenswrapper[4734]: I1205 23:36:52.158745 4734 generic.go:334] "Generic (PLEG): container finished" podID="9631bcf5-05df-4e1d-b849-7352ef35013f" containerID="2085a25019b7ea2459def9a0c4b7da64c8e9cd6a1e18f44e7a12a0c4e9cc7e96" exitCode=0 Dec 05 23:36:52 crc kubenswrapper[4734]: I1205 23:36:52.159510 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tpdrq" event={"ID":"9631bcf5-05df-4e1d-b849-7352ef35013f","Type":"ContainerDied","Data":"2085a25019b7ea2459def9a0c4b7da64c8e9cd6a1e18f44e7a12a0c4e9cc7e96"} Dec 05 23:36:52 crc kubenswrapper[4734]: I1205 23:36:52.163512 4734 generic.go:334] "Generic (PLEG): container finished" podID="b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7" containerID="278fcbbf13588cf29c089539619a0cd8216af855e376e080b334bc957ec8a4bb" exitCode=0 Dec 05 23:36:52 crc kubenswrapper[4734]: I1205 23:36:52.164047 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-t4slg" event={"ID":"b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7","Type":"ContainerDied","Data":"278fcbbf13588cf29c089539619a0cd8216af855e376e080b334bc957ec8a4bb"} Dec 05 23:36:52 crc kubenswrapper[4734]: I1205 23:36:52.168143 4734 generic.go:334] "Generic (PLEG): container finished" podID="115aa742-8de4-4cb2-84e2-c05f698eda5e" containerID="b240d5852fcb6123296ac2c6a6b3368ebcb90e5964721081cba3196355f06d85" exitCode=0 Dec 05 23:36:52 crc kubenswrapper[4734]: I1205 23:36:52.168198 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-c9btw" event={"ID":"115aa742-8de4-4cb2-84e2-c05f698eda5e","Type":"ContainerDied","Data":"b240d5852fcb6123296ac2c6a6b3368ebcb90e5964721081cba3196355f06d85"} Dec 05 23:36:53 crc kubenswrapper[4734]: I1205 23:36:53.188070 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-c9btw" 
event={"ID":"115aa742-8de4-4cb2-84e2-c05f698eda5e","Type":"ContainerStarted","Data":"77a7fadb2a77e908c44d03871ac6d355c03e2cdf50b1dd02851450d8cf30131d"} Dec 05 23:36:53 crc kubenswrapper[4734]: I1205 23:36:53.189126 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-c9btw" Dec 05 23:36:53 crc kubenswrapper[4734]: I1205 23:36:53.193857 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tpdrq" event={"ID":"9631bcf5-05df-4e1d-b849-7352ef35013f","Type":"ContainerStarted","Data":"3fbb60742fc27c995f6ce08d549e6cd11847207f36af1b5b36814c8cdb32552e"} Dec 05 23:36:53 crc kubenswrapper[4734]: I1205 23:36:53.193917 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tpdrq" event={"ID":"9631bcf5-05df-4e1d-b849-7352ef35013f","Type":"ContainerStarted","Data":"37468dc709ec624519a874f2ade83b2222e626e1806ab2f0363770702f7da7e9"} Dec 05 23:36:53 crc kubenswrapper[4734]: I1205 23:36:53.194171 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-tpdrq" Dec 05 23:36:53 crc kubenswrapper[4734]: I1205 23:36:53.194286 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-tpdrq" Dec 05 23:36:53 crc kubenswrapper[4734]: I1205 23:36:53.196760 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-t4slg" event={"ID":"b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7","Type":"ContainerStarted","Data":"5590994084ad1373c371f93d8799cb3f2bf8e2ae2b2d06f3c9d4ba161479a6c7"} Dec 05 23:36:53 crc kubenswrapper[4734]: I1205 23:36:53.197072 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-t4slg" Dec 05 23:36:53 crc kubenswrapper[4734]: I1205 23:36:53.213940 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-c9btw" 
podStartSLOduration=8.743865284 podStartE2EDuration="9.213917339s" podCreationTimestamp="2025-12-05 23:36:44 +0000 UTC" firstStartedPulling="2025-12-05 23:36:50.434741579 +0000 UTC m=+1031.118145855" lastFinishedPulling="2025-12-05 23:36:50.904793634 +0000 UTC m=+1031.588197910" observedRunningTime="2025-12-05 23:36:53.205623088 +0000 UTC m=+1033.889027374" watchObservedRunningTime="2025-12-05 23:36:53.213917339 +0000 UTC m=+1033.897321615" Dec 05 23:36:53 crc kubenswrapper[4734]: I1205 23:36:53.235638 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-t4slg" podStartSLOduration=8.787111542 podStartE2EDuration="9.235609404s" podCreationTimestamp="2025-12-05 23:36:44 +0000 UTC" firstStartedPulling="2025-12-05 23:36:50.451809552 +0000 UTC m=+1031.135213828" lastFinishedPulling="2025-12-05 23:36:50.900307414 +0000 UTC m=+1031.583711690" observedRunningTime="2025-12-05 23:36:53.2333635 +0000 UTC m=+1033.916767786" watchObservedRunningTime="2025-12-05 23:36:53.235609404 +0000 UTC m=+1033.919013680" Dec 05 23:36:53 crc kubenswrapper[4734]: I1205 23:36:53.258903 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-tpdrq" podStartSLOduration=25.190216042 podStartE2EDuration="33.258874768s" podCreationTimestamp="2025-12-05 23:36:20 +0000 UTC" firstStartedPulling="2025-12-05 23:36:41.007324714 +0000 UTC m=+1021.690728990" lastFinishedPulling="2025-12-05 23:36:49.07598344 +0000 UTC m=+1029.759387716" observedRunningTime="2025-12-05 23:36:53.252427701 +0000 UTC m=+1033.935831997" watchObservedRunningTime="2025-12-05 23:36:53.258874768 +0000 UTC m=+1033.942279054" Dec 05 23:36:55 crc kubenswrapper[4734]: I1205 23:36:55.220117 4734 generic.go:334] "Generic (PLEG): container finished" podID="3cc9e4dc-431f-4963-911b-f6262ac3c6b5" containerID="615ea66682ac3f7589f9927ac9524029ae6312075c9d6b01ab7e6a8c79505de4" exitCode=0 Dec 05 23:36:55 crc kubenswrapper[4734]: I1205 
23:36:55.220228 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"3cc9e4dc-431f-4963-911b-f6262ac3c6b5","Type":"ContainerDied","Data":"615ea66682ac3f7589f9927ac9524029ae6312075c9d6b01ab7e6a8c79505de4"} Dec 05 23:36:55 crc kubenswrapper[4734]: I1205 23:36:55.224225 4734 generic.go:334] "Generic (PLEG): container finished" podID="9fd725c7-f12a-4504-a71d-46e7d0258af7" containerID="7f71935864e7539f13401a2b23fa03a0fc61bd480149c9381b84a292043162a7" exitCode=0 Dec 05 23:36:55 crc kubenswrapper[4734]: I1205 23:36:55.224279 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9fd725c7-f12a-4504-a71d-46e7d0258af7","Type":"ContainerDied","Data":"7f71935864e7539f13401a2b23fa03a0fc61bd480149c9381b84a292043162a7"} Dec 05 23:36:56 crc kubenswrapper[4734]: I1205 23:36:56.077933 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 05 23:36:56 crc kubenswrapper[4734]: I1205 23:36:56.239373 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"3cc9e4dc-431f-4963-911b-f6262ac3c6b5","Type":"ContainerStarted","Data":"779e9b3c03ca8abf4c919f6ac8ac413e8d2fba37370fc8b32297dbc032d0448b"} Dec 05 23:36:56 crc kubenswrapper[4734]: I1205 23:36:56.242302 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350","Type":"ContainerStarted","Data":"9214af11b949216d997237612eab628f27ee4d11c99c56441ec398a3dbd1f9eb"} Dec 05 23:36:56 crc kubenswrapper[4734]: I1205 23:36:56.247159 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9fd725c7-f12a-4504-a71d-46e7d0258af7","Type":"ContainerStarted","Data":"e3ff0e43aa4399564794ecf1b6239bad25ef62c9a4db565ea663ba8115969e66"} Dec 05 23:36:56 crc kubenswrapper[4734]: I1205 23:36:56.248953 4734 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-cpzs4" event={"ID":"b246fed6-9a79-4d72-a73a-943b13d8e30b","Type":"ContainerStarted","Data":"a2c0b83afaf67e6bbfe179d626bc6be002631734930c43aaf3cff115ec2ca32b"} Dec 05 23:36:56 crc kubenswrapper[4734]: I1205 23:36:56.251563 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3fceddd4-e096-4a7e-875f-756279962334","Type":"ContainerStarted","Data":"4bde1d5ee2f86b69363fa7d9846be367e2eb40e1f737d66e745fb9c6032f596b"} Dec 05 23:36:56 crc kubenswrapper[4734]: I1205 23:36:56.274176 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=35.697235469 podStartE2EDuration="44.274144786s" podCreationTimestamp="2025-12-05 23:36:12 +0000 UTC" firstStartedPulling="2025-12-05 23:36:40.89777418 +0000 UTC m=+1021.581178446" lastFinishedPulling="2025-12-05 23:36:49.474683487 +0000 UTC m=+1030.158087763" observedRunningTime="2025-12-05 23:36:56.269931844 +0000 UTC m=+1036.953336120" watchObservedRunningTime="2025-12-05 23:36:56.274144786 +0000 UTC m=+1036.957549082" Dec 05 23:36:56 crc kubenswrapper[4734]: I1205 23:36:56.294623 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=21.185164649 podStartE2EDuration="34.294594002s" podCreationTimestamp="2025-12-05 23:36:22 +0000 UTC" firstStartedPulling="2025-12-05 23:36:41.585015524 +0000 UTC m=+1022.268419800" lastFinishedPulling="2025-12-05 23:36:54.694444867 +0000 UTC m=+1035.377849153" observedRunningTime="2025-12-05 23:36:56.291439896 +0000 UTC m=+1036.974844172" watchObservedRunningTime="2025-12-05 23:36:56.294594002 +0000 UTC m=+1036.977998288" Dec 05 23:36:56 crc kubenswrapper[4734]: I1205 23:36:56.315429 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=36.852074531 
podStartE2EDuration="45.315400086s" podCreationTimestamp="2025-12-05 23:36:11 +0000 UTC" firstStartedPulling="2025-12-05 23:36:40.612471871 +0000 UTC m=+1021.295876157" lastFinishedPulling="2025-12-05 23:36:49.075797426 +0000 UTC m=+1029.759201712" observedRunningTime="2025-12-05 23:36:56.311144163 +0000 UTC m=+1036.994548439" watchObservedRunningTime="2025-12-05 23:36:56.315400086 +0000 UTC m=+1036.998804362" Dec 05 23:36:56 crc kubenswrapper[4734]: I1205 23:36:56.337923 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-cpzs4" podStartSLOduration=8.980808722999999 podStartE2EDuration="13.337895271s" podCreationTimestamp="2025-12-05 23:36:43 +0000 UTC" firstStartedPulling="2025-12-05 23:36:50.434750989 +0000 UTC m=+1031.118155265" lastFinishedPulling="2025-12-05 23:36:54.791837537 +0000 UTC m=+1035.475241813" observedRunningTime="2025-12-05 23:36:56.33415567 +0000 UTC m=+1037.017559976" watchObservedRunningTime="2025-12-05 23:36:56.337895271 +0000 UTC m=+1037.021299547" Dec 05 23:36:56 crc kubenswrapper[4734]: I1205 23:36:56.764743 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 05 23:36:56 crc kubenswrapper[4734]: I1205 23:36:56.804676 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 05 23:36:56 crc kubenswrapper[4734]: I1205 23:36:56.829764 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=24.927033437 podStartE2EDuration="38.829732842s" podCreationTimestamp="2025-12-05 23:36:18 +0000 UTC" firstStartedPulling="2025-12-05 23:36:40.81312744 +0000 UTC m=+1021.496531716" lastFinishedPulling="2025-12-05 23:36:54.715826845 +0000 UTC m=+1035.399231121" observedRunningTime="2025-12-05 23:36:56.361078022 +0000 UTC m=+1037.044482298" watchObservedRunningTime="2025-12-05 23:36:56.829732842 +0000 UTC 
m=+1037.513137118" Dec 05 23:36:57 crc kubenswrapper[4734]: I1205 23:36:57.259898 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 05 23:36:57 crc kubenswrapper[4734]: I1205 23:36:57.318080 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.181788 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.232953 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.283910 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.323314 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.389018 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.546037 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.548116 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.554271 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.554279 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-d557q" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.555518 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.573643 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.582477 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.647749 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-t4slg" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.665540 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bf6b4283-12e2-489b-9808-9b4f21a2c080-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"bf6b4283-12e2-489b-9808-9b4f21a2c080\") " pod="openstack/ovn-northd-0" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.665624 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf6b4283-12e2-489b-9808-9b4f21a2c080-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"bf6b4283-12e2-489b-9808-9b4f21a2c080\") " pod="openstack/ovn-northd-0" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.665659 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/bf6b4283-12e2-489b-9808-9b4f21a2c080-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"bf6b4283-12e2-489b-9808-9b4f21a2c080\") " pod="openstack/ovn-northd-0" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.665719 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rghfp\" (UniqueName: \"kubernetes.io/projected/bf6b4283-12e2-489b-9808-9b4f21a2c080-kube-api-access-rghfp\") pod \"ovn-northd-0\" (UID: \"bf6b4283-12e2-489b-9808-9b4f21a2c080\") " pod="openstack/ovn-northd-0" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.665749 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf6b4283-12e2-489b-9808-9b4f21a2c080-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"bf6b4283-12e2-489b-9808-9b4f21a2c080\") " pod="openstack/ovn-northd-0" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.665797 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf6b4283-12e2-489b-9808-9b4f21a2c080-config\") pod \"ovn-northd-0\" (UID: \"bf6b4283-12e2-489b-9808-9b4f21a2c080\") " pod="openstack/ovn-northd-0" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.665849 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf6b4283-12e2-489b-9808-9b4f21a2c080-scripts\") pod \"ovn-northd-0\" (UID: \"bf6b4283-12e2-489b-9808-9b4f21a2c080\") " pod="openstack/ovn-northd-0" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.767800 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf6b4283-12e2-489b-9808-9b4f21a2c080-config\") pod \"ovn-northd-0\" (UID: 
\"bf6b4283-12e2-489b-9808-9b4f21a2c080\") " pod="openstack/ovn-northd-0" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.767900 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf6b4283-12e2-489b-9808-9b4f21a2c080-scripts\") pod \"ovn-northd-0\" (UID: \"bf6b4283-12e2-489b-9808-9b4f21a2c080\") " pod="openstack/ovn-northd-0" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.767961 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bf6b4283-12e2-489b-9808-9b4f21a2c080-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"bf6b4283-12e2-489b-9808-9b4f21a2c080\") " pod="openstack/ovn-northd-0" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.767996 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf6b4283-12e2-489b-9808-9b4f21a2c080-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"bf6b4283-12e2-489b-9808-9b4f21a2c080\") " pod="openstack/ovn-northd-0" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.768016 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf6b4283-12e2-489b-9808-9b4f21a2c080-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"bf6b4283-12e2-489b-9808-9b4f21a2c080\") " pod="openstack/ovn-northd-0" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.768054 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rghfp\" (UniqueName: \"kubernetes.io/projected/bf6b4283-12e2-489b-9808-9b4f21a2c080-kube-api-access-rghfp\") pod \"ovn-northd-0\" (UID: \"bf6b4283-12e2-489b-9808-9b4f21a2c080\") " pod="openstack/ovn-northd-0" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.768072 4734 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf6b4283-12e2-489b-9808-9b4f21a2c080-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"bf6b4283-12e2-489b-9808-9b4f21a2c080\") " pod="openstack/ovn-northd-0" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.768764 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf6b4283-12e2-489b-9808-9b4f21a2c080-config\") pod \"ovn-northd-0\" (UID: \"bf6b4283-12e2-489b-9808-9b4f21a2c080\") " pod="openstack/ovn-northd-0" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.769210 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf6b4283-12e2-489b-9808-9b4f21a2c080-scripts\") pod \"ovn-northd-0\" (UID: \"bf6b4283-12e2-489b-9808-9b4f21a2c080\") " pod="openstack/ovn-northd-0" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.770902 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bf6b4283-12e2-489b-9808-9b4f21a2c080-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"bf6b4283-12e2-489b-9808-9b4f21a2c080\") " pod="openstack/ovn-northd-0" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.777426 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf6b4283-12e2-489b-9808-9b4f21a2c080-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"bf6b4283-12e2-489b-9808-9b4f21a2c080\") " pod="openstack/ovn-northd-0" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.778043 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf6b4283-12e2-489b-9808-9b4f21a2c080-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"bf6b4283-12e2-489b-9808-9b4f21a2c080\") " pod="openstack/ovn-northd-0" Dec 05 23:36:59 crc 
kubenswrapper[4734]: I1205 23:36:59.780183 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf6b4283-12e2-489b-9808-9b4f21a2c080-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"bf6b4283-12e2-489b-9808-9b4f21a2c080\") " pod="openstack/ovn-northd-0" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.796888 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rghfp\" (UniqueName: \"kubernetes.io/projected/bf6b4283-12e2-489b-9808-9b4f21a2c080-kube-api-access-rghfp\") pod \"ovn-northd-0\" (UID: \"bf6b4283-12e2-489b-9808-9b4f21a2c080\") " pod="openstack/ovn-northd-0" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.832728 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-c9btw" Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.904873 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-t4slg"] Dec 05 23:36:59 crc kubenswrapper[4734]: I1205 23:36:59.910477 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 05 23:37:00 crc kubenswrapper[4734]: I1205 23:37:00.291483 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-t4slg" podUID="b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7" containerName="dnsmasq-dns" containerID="cri-o://5590994084ad1373c371f93d8799cb3f2bf8e2ae2b2d06f3c9d4ba161479a6c7" gracePeriod=10 Dec 05 23:37:00 crc kubenswrapper[4734]: I1205 23:37:00.430517 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 05 23:37:00 crc kubenswrapper[4734]: W1205 23:37:00.437210 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf6b4283_12e2_489b_9808_9b4f21a2c080.slice/crio-b021b96a8f14c72457412a1315a1b9716bca6164e7e71623c9b2ada4dbd29edc WatchSource:0}: Error finding container b021b96a8f14c72457412a1315a1b9716bca6164e7e71623c9b2ada4dbd29edc: Status 404 returned error can't find the container with id b021b96a8f14c72457412a1315a1b9716bca6164e7e71623c9b2ada4dbd29edc Dec 05 23:37:01 crc kubenswrapper[4734]: I1205 23:37:01.305129 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"bf6b4283-12e2-489b-9808-9b4f21a2c080","Type":"ContainerStarted","Data":"b021b96a8f14c72457412a1315a1b9716bca6164e7e71623c9b2ada4dbd29edc"} Dec 05 23:37:01 crc kubenswrapper[4734]: I1205 23:37:01.309659 4734 generic.go:334] "Generic (PLEG): container finished" podID="b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7" containerID="5590994084ad1373c371f93d8799cb3f2bf8e2ae2b2d06f3c9d4ba161479a6c7" exitCode=0 Dec 05 23:37:01 crc kubenswrapper[4734]: I1205 23:37:01.309736 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-t4slg" event={"ID":"b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7","Type":"ContainerDied","Data":"5590994084ad1373c371f93d8799cb3f2bf8e2ae2b2d06f3c9d4ba161479a6c7"} Dec 05 23:37:01 crc 
kubenswrapper[4734]: I1205 23:37:01.424298 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-t4slg" Dec 05 23:37:01 crc kubenswrapper[4734]: I1205 23:37:01.503146 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7-config\") pod \"b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7\" (UID: \"b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7\") " Dec 05 23:37:01 crc kubenswrapper[4734]: I1205 23:37:01.503274 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5bj2\" (UniqueName: \"kubernetes.io/projected/b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7-kube-api-access-r5bj2\") pod \"b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7\" (UID: \"b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7\") " Dec 05 23:37:01 crc kubenswrapper[4734]: I1205 23:37:01.503320 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7-ovsdbserver-nb\") pod \"b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7\" (UID: \"b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7\") " Dec 05 23:37:01 crc kubenswrapper[4734]: I1205 23:37:01.503342 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7-dns-svc\") pod \"b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7\" (UID: \"b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7\") " Dec 05 23:37:01 crc kubenswrapper[4734]: I1205 23:37:01.511886 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7-kube-api-access-r5bj2" (OuterVolumeSpecName: "kube-api-access-r5bj2") pod "b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7" (UID: "b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7"). InnerVolumeSpecName "kube-api-access-r5bj2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:37:01 crc kubenswrapper[4734]: I1205 23:37:01.568054 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7" (UID: "b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:37:01 crc kubenswrapper[4734]: I1205 23:37:01.568097 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7" (UID: "b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:37:01 crc kubenswrapper[4734]: I1205 23:37:01.570404 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7-config" (OuterVolumeSpecName: "config") pod "b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7" (UID: "b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:37:01 crc kubenswrapper[4734]: I1205 23:37:01.606924 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:01 crc kubenswrapper[4734]: I1205 23:37:01.606981 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5bj2\" (UniqueName: \"kubernetes.io/projected/b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7-kube-api-access-r5bj2\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:01 crc kubenswrapper[4734]: I1205 23:37:01.606994 4734 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:01 crc kubenswrapper[4734]: I1205 23:37:01.607005 4734 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:02 crc kubenswrapper[4734]: I1205 23:37:02.319839 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"bf6b4283-12e2-489b-9808-9b4f21a2c080","Type":"ContainerStarted","Data":"b5c343bfcbefce424f2132a97e83e41f623b9281c6dbdafbb8a4a763729880ae"} Dec 05 23:37:02 crc kubenswrapper[4734]: I1205 23:37:02.321479 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 05 23:37:02 crc kubenswrapper[4734]: I1205 23:37:02.321616 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"bf6b4283-12e2-489b-9808-9b4f21a2c080","Type":"ContainerStarted","Data":"e81445daeed9439e3906267e9ad317bf811ec3a227ce559a3c74f7028a181093"} Dec 05 23:37:02 crc kubenswrapper[4734]: I1205 23:37:02.322128 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-7fd796d7df-t4slg" event={"ID":"b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7","Type":"ContainerDied","Data":"873a62bc6a31389e6ec8e0f3fe070b5e8480cc0d91709ba8e9e83f0c27d78b84"} Dec 05 23:37:02 crc kubenswrapper[4734]: I1205 23:37:02.322199 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-t4slg" Dec 05 23:37:02 crc kubenswrapper[4734]: I1205 23:37:02.322244 4734 scope.go:117] "RemoveContainer" containerID="5590994084ad1373c371f93d8799cb3f2bf8e2ae2b2d06f3c9d4ba161479a6c7" Dec 05 23:37:02 crc kubenswrapper[4734]: I1205 23:37:02.339832 4734 scope.go:117] "RemoveContainer" containerID="278fcbbf13588cf29c089539619a0cd8216af855e376e080b334bc957ec8a4bb" Dec 05 23:37:02 crc kubenswrapper[4734]: I1205 23:37:02.344118 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.057355663 podStartE2EDuration="3.344095987s" podCreationTimestamp="2025-12-05 23:36:59 +0000 UTC" firstStartedPulling="2025-12-05 23:37:00.442081421 +0000 UTC m=+1041.125485697" lastFinishedPulling="2025-12-05 23:37:01.728821745 +0000 UTC m=+1042.412226021" observedRunningTime="2025-12-05 23:37:02.341965965 +0000 UTC m=+1043.025370241" watchObservedRunningTime="2025-12-05 23:37:02.344095987 +0000 UTC m=+1043.027500263" Dec 05 23:37:02 crc kubenswrapper[4734]: I1205 23:37:02.365062 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-t4slg"] Dec 05 23:37:02 crc kubenswrapper[4734]: I1205 23:37:02.372173 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-t4slg"] Dec 05 23:37:02 crc kubenswrapper[4734]: I1205 23:37:02.794135 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 05 23:37:02 crc kubenswrapper[4734]: I1205 23:37:02.794232 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/openstack-galera-0" Dec 05 23:37:02 crc kubenswrapper[4734]: I1205 23:37:02.908796 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 05 23:37:03 crc kubenswrapper[4734]: I1205 23:37:03.467501 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 05 23:37:03 crc kubenswrapper[4734]: I1205 23:37:03.625339 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7" path="/var/lib/kubelet/pods/b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7/volumes" Dec 05 23:37:03 crc kubenswrapper[4734]: I1205 23:37:03.907899 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-xm6wd"] Dec 05 23:37:03 crc kubenswrapper[4734]: E1205 23:37:03.908314 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7" containerName="dnsmasq-dns" Dec 05 23:37:03 crc kubenswrapper[4734]: I1205 23:37:03.908335 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7" containerName="dnsmasq-dns" Dec 05 23:37:03 crc kubenswrapper[4734]: E1205 23:37:03.908352 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7" containerName="init" Dec 05 23:37:03 crc kubenswrapper[4734]: I1205 23:37:03.908361 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7" containerName="init" Dec 05 23:37:03 crc kubenswrapper[4734]: I1205 23:37:03.908577 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8b60174-d2b1-4ba3-acb0-3d8f2756d4e7" containerName="dnsmasq-dns" Dec 05 23:37:03 crc kubenswrapper[4734]: I1205 23:37:03.909418 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-xm6wd" Dec 05 23:37:03 crc kubenswrapper[4734]: I1205 23:37:03.939345 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-xm6wd"] Dec 05 23:37:03 crc kubenswrapper[4734]: I1205 23:37:03.947012 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-986h4\" (UniqueName: \"kubernetes.io/projected/3d76d646-48a4-405f-ba5e-fa7ef1775294-kube-api-access-986h4\") pod \"keystone-db-create-xm6wd\" (UID: \"3d76d646-48a4-405f-ba5e-fa7ef1775294\") " pod="openstack/keystone-db-create-xm6wd" Dec 05 23:37:03 crc kubenswrapper[4734]: I1205 23:37:03.947069 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d76d646-48a4-405f-ba5e-fa7ef1775294-operator-scripts\") pod \"keystone-db-create-xm6wd\" (UID: \"3d76d646-48a4-405f-ba5e-fa7ef1775294\") " pod="openstack/keystone-db-create-xm6wd" Dec 05 23:37:03 crc kubenswrapper[4734]: I1205 23:37:03.973235 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-d0b4-account-create-update-qssx2"] Dec 05 23:37:03 crc kubenswrapper[4734]: I1205 23:37:03.974612 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d0b4-account-create-update-qssx2" Dec 05 23:37:03 crc kubenswrapper[4734]: I1205 23:37:03.977326 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 05 23:37:03 crc kubenswrapper[4734]: I1205 23:37:03.981997 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d0b4-account-create-update-qssx2"] Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.048502 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c28ce93a-e28b-4be0-87bd-38e5dc1383df-operator-scripts\") pod \"keystone-d0b4-account-create-update-qssx2\" (UID: \"c28ce93a-e28b-4be0-87bd-38e5dc1383df\") " pod="openstack/keystone-d0b4-account-create-update-qssx2" Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.048618 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s4tw\" (UniqueName: \"kubernetes.io/projected/c28ce93a-e28b-4be0-87bd-38e5dc1383df-kube-api-access-2s4tw\") pod \"keystone-d0b4-account-create-update-qssx2\" (UID: \"c28ce93a-e28b-4be0-87bd-38e5dc1383df\") " pod="openstack/keystone-d0b4-account-create-update-qssx2" Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.048662 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-986h4\" (UniqueName: \"kubernetes.io/projected/3d76d646-48a4-405f-ba5e-fa7ef1775294-kube-api-access-986h4\") pod \"keystone-db-create-xm6wd\" (UID: \"3d76d646-48a4-405f-ba5e-fa7ef1775294\") " pod="openstack/keystone-db-create-xm6wd" Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.048687 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d76d646-48a4-405f-ba5e-fa7ef1775294-operator-scripts\") pod \"keystone-db-create-xm6wd\" 
(UID: \"3d76d646-48a4-405f-ba5e-fa7ef1775294\") " pod="openstack/keystone-db-create-xm6wd" Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.049656 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d76d646-48a4-405f-ba5e-fa7ef1775294-operator-scripts\") pod \"keystone-db-create-xm6wd\" (UID: \"3d76d646-48a4-405f-ba5e-fa7ef1775294\") " pod="openstack/keystone-db-create-xm6wd" Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.071464 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-986h4\" (UniqueName: \"kubernetes.io/projected/3d76d646-48a4-405f-ba5e-fa7ef1775294-kube-api-access-986h4\") pod \"keystone-db-create-xm6wd\" (UID: \"3d76d646-48a4-405f-ba5e-fa7ef1775294\") " pod="openstack/keystone-db-create-xm6wd" Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.150869 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c28ce93a-e28b-4be0-87bd-38e5dc1383df-operator-scripts\") pod \"keystone-d0b4-account-create-update-qssx2\" (UID: \"c28ce93a-e28b-4be0-87bd-38e5dc1383df\") " pod="openstack/keystone-d0b4-account-create-update-qssx2" Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.150957 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s4tw\" (UniqueName: \"kubernetes.io/projected/c28ce93a-e28b-4be0-87bd-38e5dc1383df-kube-api-access-2s4tw\") pod \"keystone-d0b4-account-create-update-qssx2\" (UID: \"c28ce93a-e28b-4be0-87bd-38e5dc1383df\") " pod="openstack/keystone-d0b4-account-create-update-qssx2" Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.151828 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c28ce93a-e28b-4be0-87bd-38e5dc1383df-operator-scripts\") pod 
\"keystone-d0b4-account-create-update-qssx2\" (UID: \"c28ce93a-e28b-4be0-87bd-38e5dc1383df\") " pod="openstack/keystone-d0b4-account-create-update-qssx2" Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.167797 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s4tw\" (UniqueName: \"kubernetes.io/projected/c28ce93a-e28b-4be0-87bd-38e5dc1383df-kube-api-access-2s4tw\") pod \"keystone-d0b4-account-create-update-qssx2\" (UID: \"c28ce93a-e28b-4be0-87bd-38e5dc1383df\") " pod="openstack/keystone-d0b4-account-create-update-qssx2" Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.203938 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.204008 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.230957 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-xm6wd" Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.305461 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-n8dcb"] Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.310027 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-n8dcb" Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.312283 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d0b4-account-create-update-qssx2" Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.320764 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-n8dcb"] Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.326114 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.360811 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqm22\" (UniqueName: \"kubernetes.io/projected/f0266b9f-86d4-462a-a20f-8897e48bfa43-kube-api-access-tqm22\") pod \"placement-db-create-n8dcb\" (UID: \"f0266b9f-86d4-462a-a20f-8897e48bfa43\") " pod="openstack/placement-db-create-n8dcb" Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.360888 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0266b9f-86d4-462a-a20f-8897e48bfa43-operator-scripts\") pod \"placement-db-create-n8dcb\" (UID: \"f0266b9f-86d4-462a-a20f-8897e48bfa43\") " pod="openstack/placement-db-create-n8dcb" Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.438600 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-8463-account-create-update-psrz9"] Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.440833 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8463-account-create-update-psrz9" Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.446230 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.455536 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8463-account-create-update-psrz9"] Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.466190 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqm22\" (UniqueName: \"kubernetes.io/projected/f0266b9f-86d4-462a-a20f-8897e48bfa43-kube-api-access-tqm22\") pod \"placement-db-create-n8dcb\" (UID: \"f0266b9f-86d4-462a-a20f-8897e48bfa43\") " pod="openstack/placement-db-create-n8dcb" Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.466307 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0266b9f-86d4-462a-a20f-8897e48bfa43-operator-scripts\") pod \"placement-db-create-n8dcb\" (UID: \"f0266b9f-86d4-462a-a20f-8897e48bfa43\") " pod="openstack/placement-db-create-n8dcb" Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.468858 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0266b9f-86d4-462a-a20f-8897e48bfa43-operator-scripts\") pod \"placement-db-create-n8dcb\" (UID: \"f0266b9f-86d4-462a-a20f-8897e48bfa43\") " pod="openstack/placement-db-create-n8dcb" Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.482278 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.490310 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqm22\" (UniqueName: 
\"kubernetes.io/projected/f0266b9f-86d4-462a-a20f-8897e48bfa43-kube-api-access-tqm22\") pod \"placement-db-create-n8dcb\" (UID: \"f0266b9f-86d4-462a-a20f-8897e48bfa43\") " pod="openstack/placement-db-create-n8dcb" Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.569007 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40a48ff8-8c86-4d37-8962-bb74618c2558-operator-scripts\") pod \"placement-8463-account-create-update-psrz9\" (UID: \"40a48ff8-8c86-4d37-8962-bb74618c2558\") " pod="openstack/placement-8463-account-create-update-psrz9" Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.569410 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpg6z\" (UniqueName: \"kubernetes.io/projected/40a48ff8-8c86-4d37-8962-bb74618c2558-kube-api-access-kpg6z\") pod \"placement-8463-account-create-update-psrz9\" (UID: \"40a48ff8-8c86-4d37-8962-bb74618c2558\") " pod="openstack/placement-8463-account-create-update-psrz9" Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.671656 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40a48ff8-8c86-4d37-8962-bb74618c2558-operator-scripts\") pod \"placement-8463-account-create-update-psrz9\" (UID: \"40a48ff8-8c86-4d37-8962-bb74618c2558\") " pod="openstack/placement-8463-account-create-update-psrz9" Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.671815 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpg6z\" (UniqueName: \"kubernetes.io/projected/40a48ff8-8c86-4d37-8962-bb74618c2558-kube-api-access-kpg6z\") pod \"placement-8463-account-create-update-psrz9\" (UID: \"40a48ff8-8c86-4d37-8962-bb74618c2558\") " pod="openstack/placement-8463-account-create-update-psrz9" Dec 05 23:37:04 crc kubenswrapper[4734]: 
I1205 23:37:04.673316 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40a48ff8-8c86-4d37-8962-bb74618c2558-operator-scripts\") pod \"placement-8463-account-create-update-psrz9\" (UID: \"40a48ff8-8c86-4d37-8962-bb74618c2558\") " pod="openstack/placement-8463-account-create-update-psrz9" Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.695722 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-n8dcb" Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.698570 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpg6z\" (UniqueName: \"kubernetes.io/projected/40a48ff8-8c86-4d37-8962-bb74618c2558-kube-api-access-kpg6z\") pod \"placement-8463-account-create-update-psrz9\" (UID: \"40a48ff8-8c86-4d37-8962-bb74618c2558\") " pod="openstack/placement-8463-account-create-update-psrz9" Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.772692 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8463-account-create-update-psrz9" Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.881152 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-xm6wd"] Dec 05 23:37:04 crc kubenswrapper[4734]: I1205 23:37:04.981014 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d0b4-account-create-update-qssx2"] Dec 05 23:37:04 crc kubenswrapper[4734]: W1205 23:37:04.991731 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc28ce93a_e28b_4be0_87bd_38e5dc1383df.slice/crio-99b68f4124008aeecdffb3d8201a58c2948e277850ec891bb6e7db77bb878dc3 WatchSource:0}: Error finding container 99b68f4124008aeecdffb3d8201a58c2948e277850ec891bb6e7db77bb878dc3: Status 404 returned error can't find the container with id 99b68f4124008aeecdffb3d8201a58c2948e277850ec891bb6e7db77bb878dc3 Dec 05 23:37:05 crc kubenswrapper[4734]: I1205 23:37:05.143517 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-n8dcb"] Dec 05 23:37:05 crc kubenswrapper[4734]: I1205 23:37:05.265886 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8463-account-create-update-psrz9"] Dec 05 23:37:05 crc kubenswrapper[4734]: W1205 23:37:05.282152 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40a48ff8_8c86_4d37_8962_bb74618c2558.slice/crio-f24e000e1656242c3aeedb68fb7e261acf111fd6c0f2e36e3d18e16eb42e9034 WatchSource:0}: Error finding container f24e000e1656242c3aeedb68fb7e261acf111fd6c0f2e36e3d18e16eb42e9034: Status 404 returned error can't find the container with id f24e000e1656242c3aeedb68fb7e261acf111fd6c0f2e36e3d18e16eb42e9034 Dec 05 23:37:05 crc kubenswrapper[4734]: I1205 23:37:05.355374 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-d0b4-account-create-update-qssx2" event={"ID":"c28ce93a-e28b-4be0-87bd-38e5dc1383df","Type":"ContainerStarted","Data":"99b68f4124008aeecdffb3d8201a58c2948e277850ec891bb6e7db77bb878dc3"} Dec 05 23:37:05 crc kubenswrapper[4734]: I1205 23:37:05.357177 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-n8dcb" event={"ID":"f0266b9f-86d4-462a-a20f-8897e48bfa43","Type":"ContainerStarted","Data":"a5a62de4c5f33235629c82b7c070df100b3820f5990533b894a942da10a4a028"} Dec 05 23:37:05 crc kubenswrapper[4734]: I1205 23:37:05.359114 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8463-account-create-update-psrz9" event={"ID":"40a48ff8-8c86-4d37-8962-bb74618c2558","Type":"ContainerStarted","Data":"f24e000e1656242c3aeedb68fb7e261acf111fd6c0f2e36e3d18e16eb42e9034"} Dec 05 23:37:05 crc kubenswrapper[4734]: I1205 23:37:05.364383 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-xm6wd" event={"ID":"3d76d646-48a4-405f-ba5e-fa7ef1775294","Type":"ContainerStarted","Data":"9b4adfba9cb2e0ec28fc8f79440f61f1e28013f2db74580cc72374cbd7221c7e"} Dec 05 23:37:05 crc kubenswrapper[4734]: I1205 23:37:05.364429 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-xm6wd" event={"ID":"3d76d646-48a4-405f-ba5e-fa7ef1775294","Type":"ContainerStarted","Data":"03f8753325fbeddd6665368095188b20dbf055c070838b4e1b601dac8931f5f1"} Dec 05 23:37:05 crc kubenswrapper[4734]: I1205 23:37:05.388559 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-xm6wd" podStartSLOduration=2.3884959 podStartE2EDuration="2.3884959s" podCreationTimestamp="2025-12-05 23:37:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:37:05.378428927 +0000 UTC m=+1046.061833203" watchObservedRunningTime="2025-12-05 
23:37:05.3884959 +0000 UTC m=+1046.071900196" Dec 05 23:37:06 crc kubenswrapper[4734]: I1205 23:37:06.022463 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-l7vfg"] Dec 05 23:37:06 crc kubenswrapper[4734]: I1205 23:37:06.033991 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-l7vfg" Dec 05 23:37:06 crc kubenswrapper[4734]: I1205 23:37:06.049241 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-l7vfg"] Dec 05 23:37:06 crc kubenswrapper[4734]: I1205 23:37:06.201309 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0c9ffcb-625f-49f8-869a-e71e5f53b92b-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-l7vfg\" (UID: \"e0c9ffcb-625f-49f8-869a-e71e5f53b92b\") " pod="openstack/dnsmasq-dns-698758b865-l7vfg" Dec 05 23:37:06 crc kubenswrapper[4734]: I1205 23:37:06.201374 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c79lm\" (UniqueName: \"kubernetes.io/projected/e0c9ffcb-625f-49f8-869a-e71e5f53b92b-kube-api-access-c79lm\") pod \"dnsmasq-dns-698758b865-l7vfg\" (UID: \"e0c9ffcb-625f-49f8-869a-e71e5f53b92b\") " pod="openstack/dnsmasq-dns-698758b865-l7vfg" Dec 05 23:37:06 crc kubenswrapper[4734]: I1205 23:37:06.201700 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0c9ffcb-625f-49f8-869a-e71e5f53b92b-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-l7vfg\" (UID: \"e0c9ffcb-625f-49f8-869a-e71e5f53b92b\") " pod="openstack/dnsmasq-dns-698758b865-l7vfg" Dec 05 23:37:06 crc kubenswrapper[4734]: I1205 23:37:06.201880 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/e0c9ffcb-625f-49f8-869a-e71e5f53b92b-dns-svc\") pod \"dnsmasq-dns-698758b865-l7vfg\" (UID: \"e0c9ffcb-625f-49f8-869a-e71e5f53b92b\") " pod="openstack/dnsmasq-dns-698758b865-l7vfg" Dec 05 23:37:06 crc kubenswrapper[4734]: I1205 23:37:06.202205 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0c9ffcb-625f-49f8-869a-e71e5f53b92b-config\") pod \"dnsmasq-dns-698758b865-l7vfg\" (UID: \"e0c9ffcb-625f-49f8-869a-e71e5f53b92b\") " pod="openstack/dnsmasq-dns-698758b865-l7vfg" Dec 05 23:37:06 crc kubenswrapper[4734]: I1205 23:37:06.304071 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0c9ffcb-625f-49f8-869a-e71e5f53b92b-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-l7vfg\" (UID: \"e0c9ffcb-625f-49f8-869a-e71e5f53b92b\") " pod="openstack/dnsmasq-dns-698758b865-l7vfg" Dec 05 23:37:06 crc kubenswrapper[4734]: I1205 23:37:06.304136 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0c9ffcb-625f-49f8-869a-e71e5f53b92b-dns-svc\") pod \"dnsmasq-dns-698758b865-l7vfg\" (UID: \"e0c9ffcb-625f-49f8-869a-e71e5f53b92b\") " pod="openstack/dnsmasq-dns-698758b865-l7vfg" Dec 05 23:37:06 crc kubenswrapper[4734]: I1205 23:37:06.304204 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0c9ffcb-625f-49f8-869a-e71e5f53b92b-config\") pod \"dnsmasq-dns-698758b865-l7vfg\" (UID: \"e0c9ffcb-625f-49f8-869a-e71e5f53b92b\") " pod="openstack/dnsmasq-dns-698758b865-l7vfg" Dec 05 23:37:06 crc kubenswrapper[4734]: I1205 23:37:06.304224 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0c9ffcb-625f-49f8-869a-e71e5f53b92b-ovsdbserver-sb\") pod 
\"dnsmasq-dns-698758b865-l7vfg\" (UID: \"e0c9ffcb-625f-49f8-869a-e71e5f53b92b\") " pod="openstack/dnsmasq-dns-698758b865-l7vfg"
Dec 05 23:37:06 crc kubenswrapper[4734]: I1205 23:37:06.304247 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c79lm\" (UniqueName: \"kubernetes.io/projected/e0c9ffcb-625f-49f8-869a-e71e5f53b92b-kube-api-access-c79lm\") pod \"dnsmasq-dns-698758b865-l7vfg\" (UID: \"e0c9ffcb-625f-49f8-869a-e71e5f53b92b\") " pod="openstack/dnsmasq-dns-698758b865-l7vfg"
Dec 05 23:37:06 crc kubenswrapper[4734]: I1205 23:37:06.305193 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0c9ffcb-625f-49f8-869a-e71e5f53b92b-config\") pod \"dnsmasq-dns-698758b865-l7vfg\" (UID: \"e0c9ffcb-625f-49f8-869a-e71e5f53b92b\") " pod="openstack/dnsmasq-dns-698758b865-l7vfg"
Dec 05 23:37:06 crc kubenswrapper[4734]: I1205 23:37:06.305307 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0c9ffcb-625f-49f8-869a-e71e5f53b92b-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-l7vfg\" (UID: \"e0c9ffcb-625f-49f8-869a-e71e5f53b92b\") " pod="openstack/dnsmasq-dns-698758b865-l7vfg"
Dec 05 23:37:06 crc kubenswrapper[4734]: I1205 23:37:06.305433 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0c9ffcb-625f-49f8-869a-e71e5f53b92b-dns-svc\") pod \"dnsmasq-dns-698758b865-l7vfg\" (UID: \"e0c9ffcb-625f-49f8-869a-e71e5f53b92b\") " pod="openstack/dnsmasq-dns-698758b865-l7vfg"
Dec 05 23:37:06 crc kubenswrapper[4734]: I1205 23:37:06.305458 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0c9ffcb-625f-49f8-869a-e71e5f53b92b-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-l7vfg\" (UID: \"e0c9ffcb-625f-49f8-869a-e71e5f53b92b\") " pod="openstack/dnsmasq-dns-698758b865-l7vfg"
Dec 05 23:37:06 crc kubenswrapper[4734]: I1205 23:37:06.332987 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c79lm\" (UniqueName: \"kubernetes.io/projected/e0c9ffcb-625f-49f8-869a-e71e5f53b92b-kube-api-access-c79lm\") pod \"dnsmasq-dns-698758b865-l7vfg\" (UID: \"e0c9ffcb-625f-49f8-869a-e71e5f53b92b\") " pod="openstack/dnsmasq-dns-698758b865-l7vfg"
Dec 05 23:37:06 crc kubenswrapper[4734]: I1205 23:37:06.353107 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-l7vfg"
Dec 05 23:37:06 crc kubenswrapper[4734]: I1205 23:37:06.379948 4734 generic.go:334] "Generic (PLEG): container finished" podID="c28ce93a-e28b-4be0-87bd-38e5dc1383df" containerID="f9185ca68f816af79e3ee4414d88f12b53176d69b37e70ffedf32cdf1cb19599" exitCode=0
Dec 05 23:37:06 crc kubenswrapper[4734]: I1205 23:37:06.380097 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d0b4-account-create-update-qssx2" event={"ID":"c28ce93a-e28b-4be0-87bd-38e5dc1383df","Type":"ContainerDied","Data":"f9185ca68f816af79e3ee4414d88f12b53176d69b37e70ffedf32cdf1cb19599"}
Dec 05 23:37:06 crc kubenswrapper[4734]: I1205 23:37:06.382003 4734 generic.go:334] "Generic (PLEG): container finished" podID="f0266b9f-86d4-462a-a20f-8897e48bfa43" containerID="7d6874eb4a8f97ea3a945ced508aaf272a8cce2f377d304d99de29b142e4fec6" exitCode=0
Dec 05 23:37:06 crc kubenswrapper[4734]: I1205 23:37:06.382104 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-n8dcb" event={"ID":"f0266b9f-86d4-462a-a20f-8897e48bfa43","Type":"ContainerDied","Data":"7d6874eb4a8f97ea3a945ced508aaf272a8cce2f377d304d99de29b142e4fec6"}
Dec 05 23:37:06 crc kubenswrapper[4734]: I1205 23:37:06.386399 4734 generic.go:334] "Generic (PLEG): container finished" podID="40a48ff8-8c86-4d37-8962-bb74618c2558" containerID="64cdede5a62c626b94ec4a2cfded9bdb990ab81051608b09b39824cea9f2859c" exitCode=0
Dec 05 23:37:06 crc kubenswrapper[4734]: I1205 23:37:06.386464 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8463-account-create-update-psrz9" event={"ID":"40a48ff8-8c86-4d37-8962-bb74618c2558","Type":"ContainerDied","Data":"64cdede5a62c626b94ec4a2cfded9bdb990ab81051608b09b39824cea9f2859c"}
Dec 05 23:37:06 crc kubenswrapper[4734]: I1205 23:37:06.416752 4734 generic.go:334] "Generic (PLEG): container finished" podID="3d76d646-48a4-405f-ba5e-fa7ef1775294" containerID="9b4adfba9cb2e0ec28fc8f79440f61f1e28013f2db74580cc72374cbd7221c7e" exitCode=0
Dec 05 23:37:06 crc kubenswrapper[4734]: I1205 23:37:06.417249 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-xm6wd" event={"ID":"3d76d646-48a4-405f-ba5e-fa7ef1775294","Type":"ContainerDied","Data":"9b4adfba9cb2e0ec28fc8f79440f61f1e28013f2db74580cc72374cbd7221c7e"}
Dec 05 23:37:06 crc kubenswrapper[4734]: I1205 23:37:06.784683 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-l7vfg"]
Dec 05 23:37:06 crc kubenswrapper[4734]: W1205 23:37:06.787231 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0c9ffcb_625f_49f8_869a_e71e5f53b92b.slice/crio-1b5b286881ff51d2dbaf4ca3a824de9e524259ccaba1c9f7ac8244cffcb3345b WatchSource:0}: Error finding container 1b5b286881ff51d2dbaf4ca3a824de9e524259ccaba1c9f7ac8244cffcb3345b: Status 404 returned error can't find the container with id 1b5b286881ff51d2dbaf4ca3a824de9e524259ccaba1c9f7ac8244cffcb3345b
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.097693 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.105112 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.108265 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.108730 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.110341 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-glpvk"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.110594 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.129449 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.220655 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fea25d07-8cbc-4875-89e8-1752b0ee2a9e-lock\") pod \"swift-storage-0\" (UID: \"fea25d07-8cbc-4875-89e8-1752b0ee2a9e\") " pod="openstack/swift-storage-0"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.221090 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fea25d07-8cbc-4875-89e8-1752b0ee2a9e-cache\") pod \"swift-storage-0\" (UID: \"fea25d07-8cbc-4875-89e8-1752b0ee2a9e\") " pod="openstack/swift-storage-0"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.221310 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jxxp\" (UniqueName: \"kubernetes.io/projected/fea25d07-8cbc-4875-89e8-1752b0ee2a9e-kube-api-access-8jxxp\") pod \"swift-storage-0\" (UID: \"fea25d07-8cbc-4875-89e8-1752b0ee2a9e\") " pod="openstack/swift-storage-0"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.221410 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fea25d07-8cbc-4875-89e8-1752b0ee2a9e-etc-swift\") pod \"swift-storage-0\" (UID: \"fea25d07-8cbc-4875-89e8-1752b0ee2a9e\") " pod="openstack/swift-storage-0"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.221607 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"fea25d07-8cbc-4875-89e8-1752b0ee2a9e\") " pod="openstack/swift-storage-0"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.323692 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fea25d07-8cbc-4875-89e8-1752b0ee2a9e-etc-swift\") pod \"swift-storage-0\" (UID: \"fea25d07-8cbc-4875-89e8-1752b0ee2a9e\") " pod="openstack/swift-storage-0"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.323769 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"fea25d07-8cbc-4875-89e8-1752b0ee2a9e\") " pod="openstack/swift-storage-0"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.323853 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fea25d07-8cbc-4875-89e8-1752b0ee2a9e-lock\") pod \"swift-storage-0\" (UID: \"fea25d07-8cbc-4875-89e8-1752b0ee2a9e\") " pod="openstack/swift-storage-0"
Dec 05 23:37:07 crc kubenswrapper[4734]: E1205 23:37:07.323868 4734 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Dec 05 23:37:07 crc kubenswrapper[4734]: E1205 23:37:07.323894 4734 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.323913 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fea25d07-8cbc-4875-89e8-1752b0ee2a9e-cache\") pod \"swift-storage-0\" (UID: \"fea25d07-8cbc-4875-89e8-1752b0ee2a9e\") " pod="openstack/swift-storage-0"
Dec 05 23:37:07 crc kubenswrapper[4734]: E1205 23:37:07.324045 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fea25d07-8cbc-4875-89e8-1752b0ee2a9e-etc-swift podName:fea25d07-8cbc-4875-89e8-1752b0ee2a9e nodeName:}" failed. No retries permitted until 2025-12-05 23:37:07.823931314 +0000 UTC m=+1048.507335590 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fea25d07-8cbc-4875-89e8-1752b0ee2a9e-etc-swift") pod "swift-storage-0" (UID: "fea25d07-8cbc-4875-89e8-1752b0ee2a9e") : configmap "swift-ring-files" not found
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.324070 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jxxp\" (UniqueName: \"kubernetes.io/projected/fea25d07-8cbc-4875-89e8-1752b0ee2a9e-kube-api-access-8jxxp\") pod \"swift-storage-0\" (UID: \"fea25d07-8cbc-4875-89e8-1752b0ee2a9e\") " pod="openstack/swift-storage-0"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.324217 4734 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"fea25d07-8cbc-4875-89e8-1752b0ee2a9e\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/swift-storage-0"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.324592 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fea25d07-8cbc-4875-89e8-1752b0ee2a9e-cache\") pod \"swift-storage-0\" (UID: \"fea25d07-8cbc-4875-89e8-1752b0ee2a9e\") " pod="openstack/swift-storage-0"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.324942 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fea25d07-8cbc-4875-89e8-1752b0ee2a9e-lock\") pod \"swift-storage-0\" (UID: \"fea25d07-8cbc-4875-89e8-1752b0ee2a9e\") " pod="openstack/swift-storage-0"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.348253 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jxxp\" (UniqueName: \"kubernetes.io/projected/fea25d07-8cbc-4875-89e8-1752b0ee2a9e-kube-api-access-8jxxp\") pod \"swift-storage-0\" (UID: \"fea25d07-8cbc-4875-89e8-1752b0ee2a9e\") " pod="openstack/swift-storage-0"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.350300 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"fea25d07-8cbc-4875-89e8-1752b0ee2a9e\") " pod="openstack/swift-storage-0"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.403459 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-7kszd"]
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.405562 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-7kszd"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.408609 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.408979 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.410085 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.427607 4734 generic.go:334] "Generic (PLEG): container finished" podID="e0c9ffcb-625f-49f8-869a-e71e5f53b92b" containerID="16e08f8a60eef0b7a460ab170fd2f7f8d972e23768931e83cf026c83257636ab" exitCode=0
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.427680 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-l7vfg" event={"ID":"e0c9ffcb-625f-49f8-869a-e71e5f53b92b","Type":"ContainerDied","Data":"16e08f8a60eef0b7a460ab170fd2f7f8d972e23768931e83cf026c83257636ab"}
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.427925 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-l7vfg" event={"ID":"e0c9ffcb-625f-49f8-869a-e71e5f53b92b","Type":"ContainerStarted","Data":"1b5b286881ff51d2dbaf4ca3a824de9e524259ccaba1c9f7ac8244cffcb3345b"}
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.527519 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6534f554-83b4-4fea-8b0d-824961d655d5-scripts\") pod \"swift-ring-rebalance-7kszd\" (UID: \"6534f554-83b4-4fea-8b0d-824961d655d5\") " pod="openstack/swift-ring-rebalance-7kszd"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.527614 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6534f554-83b4-4fea-8b0d-824961d655d5-ring-data-devices\") pod \"swift-ring-rebalance-7kszd\" (UID: \"6534f554-83b4-4fea-8b0d-824961d655d5\") " pod="openstack/swift-ring-rebalance-7kszd"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.527692 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6534f554-83b4-4fea-8b0d-824961d655d5-combined-ca-bundle\") pod \"swift-ring-rebalance-7kszd\" (UID: \"6534f554-83b4-4fea-8b0d-824961d655d5\") " pod="openstack/swift-ring-rebalance-7kszd"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.527734 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6534f554-83b4-4fea-8b0d-824961d655d5-swiftconf\") pod \"swift-ring-rebalance-7kszd\" (UID: \"6534f554-83b4-4fea-8b0d-824961d655d5\") " pod="openstack/swift-ring-rebalance-7kszd"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.527785 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6534f554-83b4-4fea-8b0d-824961d655d5-etc-swift\") pod \"swift-ring-rebalance-7kszd\" (UID: \"6534f554-83b4-4fea-8b0d-824961d655d5\") " pod="openstack/swift-ring-rebalance-7kszd"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.527849 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6534f554-83b4-4fea-8b0d-824961d655d5-dispersionconf\") pod \"swift-ring-rebalance-7kszd\" (UID: \"6534f554-83b4-4fea-8b0d-824961d655d5\") " pod="openstack/swift-ring-rebalance-7kszd"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.527874 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfpsd\" (UniqueName: \"kubernetes.io/projected/6534f554-83b4-4fea-8b0d-824961d655d5-kube-api-access-dfpsd\") pod \"swift-ring-rebalance-7kszd\" (UID: \"6534f554-83b4-4fea-8b0d-824961d655d5\") " pod="openstack/swift-ring-rebalance-7kszd"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.532650 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-7kszd"]
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.553617 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-qdl57"]
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.555290 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-qdl57"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.631831 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6534f554-83b4-4fea-8b0d-824961d655d5-ring-data-devices\") pod \"swift-ring-rebalance-7kszd\" (UID: \"6534f554-83b4-4fea-8b0d-824961d655d5\") " pod="openstack/swift-ring-rebalance-7kszd"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.631898 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-swiftconf\") pod \"swift-ring-rebalance-qdl57\" (UID: \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\") " pod="openstack/swift-ring-rebalance-qdl57"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.631945 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-scripts\") pod \"swift-ring-rebalance-qdl57\" (UID: \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\") " pod="openstack/swift-ring-rebalance-qdl57"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.632000 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-dispersionconf\") pod \"swift-ring-rebalance-qdl57\" (UID: \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\") " pod="openstack/swift-ring-rebalance-qdl57"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.632020 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6534f554-83b4-4fea-8b0d-824961d655d5-combined-ca-bundle\") pod \"swift-ring-rebalance-7kszd\" (UID: \"6534f554-83b4-4fea-8b0d-824961d655d5\") " pod="openstack/swift-ring-rebalance-7kszd"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.632086 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6534f554-83b4-4fea-8b0d-824961d655d5-swiftconf\") pod \"swift-ring-rebalance-7kszd\" (UID: \"6534f554-83b4-4fea-8b0d-824961d655d5\") " pod="openstack/swift-ring-rebalance-7kszd"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.632124 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-etc-swift\") pod \"swift-ring-rebalance-qdl57\" (UID: \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\") " pod="openstack/swift-ring-rebalance-qdl57"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.632144 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-combined-ca-bundle\") pod \"swift-ring-rebalance-qdl57\" (UID: \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\") " pod="openstack/swift-ring-rebalance-qdl57"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.634040 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6534f554-83b4-4fea-8b0d-824961d655d5-ring-data-devices\") pod \"swift-ring-rebalance-7kszd\" (UID: \"6534f554-83b4-4fea-8b0d-824961d655d5\") " pod="openstack/swift-ring-rebalance-7kszd"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.641122 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6534f554-83b4-4fea-8b0d-824961d655d5-etc-swift\") pod \"swift-ring-rebalance-7kszd\" (UID: \"6534f554-83b4-4fea-8b0d-824961d655d5\") " pod="openstack/swift-ring-rebalance-7kszd"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.641202 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-7kszd"]
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.645951 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6534f554-83b4-4fea-8b0d-824961d655d5-etc-swift\") pod \"swift-ring-rebalance-7kszd\" (UID: \"6534f554-83b4-4fea-8b0d-824961d655d5\") " pod="openstack/swift-ring-rebalance-7kszd"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.646167 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ks6x\" (UniqueName: \"kubernetes.io/projected/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-kube-api-access-5ks6x\") pod \"swift-ring-rebalance-qdl57\" (UID: \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\") " pod="openstack/swift-ring-rebalance-qdl57"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.646277 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6534f554-83b4-4fea-8b0d-824961d655d5-dispersionconf\") pod \"swift-ring-rebalance-7kszd\" (UID: \"6534f554-83b4-4fea-8b0d-824961d655d5\") " pod="openstack/swift-ring-rebalance-7kszd"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.646309 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-ring-data-devices\") pod \"swift-ring-rebalance-qdl57\" (UID: \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\") " pod="openstack/swift-ring-rebalance-qdl57"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.646372 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfpsd\" (UniqueName: \"kubernetes.io/projected/6534f554-83b4-4fea-8b0d-824961d655d5-kube-api-access-dfpsd\") pod \"swift-ring-rebalance-7kszd\" (UID: \"6534f554-83b4-4fea-8b0d-824961d655d5\") " pod="openstack/swift-ring-rebalance-7kszd"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.646436 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6534f554-83b4-4fea-8b0d-824961d655d5-scripts\") pod \"swift-ring-rebalance-7kszd\" (UID: \"6534f554-83b4-4fea-8b0d-824961d655d5\") " pod="openstack/swift-ring-rebalance-7kszd"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.647388 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6534f554-83b4-4fea-8b0d-824961d655d5-scripts\") pod \"swift-ring-rebalance-7kszd\" (UID: \"6534f554-83b4-4fea-8b0d-824961d655d5\") " pod="openstack/swift-ring-rebalance-7kszd"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.652801 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6534f554-83b4-4fea-8b0d-824961d655d5-combined-ca-bundle\") pod \"swift-ring-rebalance-7kszd\" (UID: \"6534f554-83b4-4fea-8b0d-824961d655d5\") " pod="openstack/swift-ring-rebalance-7kszd"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.662271 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6534f554-83b4-4fea-8b0d-824961d655d5-swiftconf\") pod \"swift-ring-rebalance-7kszd\" (UID: \"6534f554-83b4-4fea-8b0d-824961d655d5\") " pod="openstack/swift-ring-rebalance-7kszd"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.664338 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6534f554-83b4-4fea-8b0d-824961d655d5-dispersionconf\") pod \"swift-ring-rebalance-7kszd\" (UID: \"6534f554-83b4-4fea-8b0d-824961d655d5\") " pod="openstack/swift-ring-rebalance-7kszd"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.757114 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-swiftconf\") pod \"swift-ring-rebalance-qdl57\" (UID: \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\") " pod="openstack/swift-ring-rebalance-qdl57"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.757206 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-scripts\") pod \"swift-ring-rebalance-qdl57\" (UID: \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\") " pod="openstack/swift-ring-rebalance-qdl57"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.757348 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-dispersionconf\") pod \"swift-ring-rebalance-qdl57\" (UID: \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\") " pod="openstack/swift-ring-rebalance-qdl57"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.757507 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-etc-swift\") pod \"swift-ring-rebalance-qdl57\" (UID: \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\") " pod="openstack/swift-ring-rebalance-qdl57"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.757563 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-combined-ca-bundle\") pod \"swift-ring-rebalance-qdl57\" (UID: \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\") " pod="openstack/swift-ring-rebalance-qdl57"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.757687 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ks6x\" (UniqueName: \"kubernetes.io/projected/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-kube-api-access-5ks6x\") pod \"swift-ring-rebalance-qdl57\" (UID: \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\") " pod="openstack/swift-ring-rebalance-qdl57"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.757772 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-ring-data-devices\") pod \"swift-ring-rebalance-qdl57\" (UID: \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\") " pod="openstack/swift-ring-rebalance-qdl57"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.759360 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfpsd\" (UniqueName: \"kubernetes.io/projected/6534f554-83b4-4fea-8b0d-824961d655d5-kube-api-access-dfpsd\") pod \"swift-ring-rebalance-7kszd\" (UID: \"6534f554-83b4-4fea-8b0d-824961d655d5\") " pod="openstack/swift-ring-rebalance-7kszd"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.764810 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-swiftconf\") pod \"swift-ring-rebalance-qdl57\" (UID: \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\") " pod="openstack/swift-ring-rebalance-qdl57"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.765518 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-scripts\") pod \"swift-ring-rebalance-qdl57\" (UID: \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\") " pod="openstack/swift-ring-rebalance-qdl57"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.799235 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-combined-ca-bundle\") pod \"swift-ring-rebalance-qdl57\" (UID: \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\") " pod="openstack/swift-ring-rebalance-qdl57"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.806485 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-ring-data-devices\") pod \"swift-ring-rebalance-qdl57\" (UID: \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\") " pod="openstack/swift-ring-rebalance-qdl57"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.808398 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-qdl57"]
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.826299 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-dispersionconf\") pod \"swift-ring-rebalance-qdl57\" (UID: \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\") " pod="openstack/swift-ring-rebalance-qdl57"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.827582 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-etc-swift\") pod \"swift-ring-rebalance-qdl57\" (UID: \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\") " pod="openstack/swift-ring-rebalance-qdl57"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.850440 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ks6x\" (UniqueName: \"kubernetes.io/projected/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-kube-api-access-5ks6x\") pod \"swift-ring-rebalance-qdl57\" (UID: \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\") " pod="openstack/swift-ring-rebalance-qdl57"
Dec 05 23:37:07 crc kubenswrapper[4734]: E1205 23:37:07.918840 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-dfpsd], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-7kszd" podUID="6534f554-83b4-4fea-8b0d-824961d655d5"
Dec 05 23:37:07 crc kubenswrapper[4734]: I1205 23:37:07.922674 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fea25d07-8cbc-4875-89e8-1752b0ee2a9e-etc-swift\") pod \"swift-storage-0\" (UID: \"fea25d07-8cbc-4875-89e8-1752b0ee2a9e\") " pod="openstack/swift-storage-0"
Dec 05 23:37:07 crc kubenswrapper[4734]: E1205 23:37:07.925880 4734 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Dec 05 23:37:07 crc kubenswrapper[4734]: E1205 23:37:07.925924 4734 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Dec 05 23:37:07 crc kubenswrapper[4734]: E1205 23:37:07.925987 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fea25d07-8cbc-4875-89e8-1752b0ee2a9e-etc-swift podName:fea25d07-8cbc-4875-89e8-1752b0ee2a9e nodeName:}" failed. No retries permitted until 2025-12-05 23:37:08.925965025 +0000 UTC m=+1049.609369301 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fea25d07-8cbc-4875-89e8-1752b0ee2a9e-etc-swift") pod "swift-storage-0" (UID: "fea25d07-8cbc-4875-89e8-1752b0ee2a9e") : configmap "swift-ring-files" not found
Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.041811 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-qdl57"
Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.274660 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-n8dcb"
Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.335180 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqm22\" (UniqueName: \"kubernetes.io/projected/f0266b9f-86d4-462a-a20f-8897e48bfa43-kube-api-access-tqm22\") pod \"f0266b9f-86d4-462a-a20f-8897e48bfa43\" (UID: \"f0266b9f-86d4-462a-a20f-8897e48bfa43\") "
Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.335337 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0266b9f-86d4-462a-a20f-8897e48bfa43-operator-scripts\") pod \"f0266b9f-86d4-462a-a20f-8897e48bfa43\" (UID: \"f0266b9f-86d4-462a-a20f-8897e48bfa43\") "
Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.336449 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0266b9f-86d4-462a-a20f-8897e48bfa43-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f0266b9f-86d4-462a-a20f-8897e48bfa43" (UID: "f0266b9f-86d4-462a-a20f-8897e48bfa43"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.342971 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0266b9f-86d4-462a-a20f-8897e48bfa43-kube-api-access-tqm22" (OuterVolumeSpecName: "kube-api-access-tqm22") pod "f0266b9f-86d4-462a-a20f-8897e48bfa43" (UID: "f0266b9f-86d4-462a-a20f-8897e48bfa43"). InnerVolumeSpecName "kube-api-access-tqm22". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.438108 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqm22\" (UniqueName: \"kubernetes.io/projected/f0266b9f-86d4-462a-a20f-8897e48bfa43-kube-api-access-tqm22\") on node \"crc\" DevicePath \"\""
Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.438156 4734 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0266b9f-86d4-462a-a20f-8897e48bfa43-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.445723 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-n8dcb"
Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.445756 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-n8dcb" event={"ID":"f0266b9f-86d4-462a-a20f-8897e48bfa43","Type":"ContainerDied","Data":"a5a62de4c5f33235629c82b7c070df100b3820f5990533b894a942da10a4a028"}
Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.445825 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5a62de4c5f33235629c82b7c070df100b3820f5990533b894a942da10a4a028"
Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.445717 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-7kszd"
Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.460774 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-7kszd"
Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.539584 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6534f554-83b4-4fea-8b0d-824961d655d5-swiftconf\") pod \"6534f554-83b4-4fea-8b0d-824961d655d5\" (UID: \"6534f554-83b4-4fea-8b0d-824961d655d5\") "
Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.539712 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6534f554-83b4-4fea-8b0d-824961d655d5-combined-ca-bundle\") pod \"6534f554-83b4-4fea-8b0d-824961d655d5\" (UID: \"6534f554-83b4-4fea-8b0d-824961d655d5\") "
Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.539778 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6534f554-83b4-4fea-8b0d-824961d655d5-dispersionconf\") pod \"6534f554-83b4-4fea-8b0d-824961d655d5\" (UID: \"6534f554-83b4-4fea-8b0d-824961d655d5\") "
Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.539859 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6534f554-83b4-4fea-8b0d-824961d655d5-scripts\") pod \"6534f554-83b4-4fea-8b0d-824961d655d5\" (UID: \"6534f554-83b4-4fea-8b0d-824961d655d5\") "
Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.539908 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6534f554-83b4-4fea-8b0d-824961d655d5-ring-data-devices\") pod \"6534f554-83b4-4fea-8b0d-824961d655d5\" (UID: \"6534f554-83b4-4fea-8b0d-824961d655d5\")
" Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.539995 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfpsd\" (UniqueName: \"kubernetes.io/projected/6534f554-83b4-4fea-8b0d-824961d655d5-kube-api-access-dfpsd\") pod \"6534f554-83b4-4fea-8b0d-824961d655d5\" (UID: \"6534f554-83b4-4fea-8b0d-824961d655d5\") " Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.540076 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6534f554-83b4-4fea-8b0d-824961d655d5-etc-swift\") pod \"6534f554-83b4-4fea-8b0d-824961d655d5\" (UID: \"6534f554-83b4-4fea-8b0d-824961d655d5\") " Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.540905 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6534f554-83b4-4fea-8b0d-824961d655d5-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6534f554-83b4-4fea-8b0d-824961d655d5" (UID: "6534f554-83b4-4fea-8b0d-824961d655d5"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.540929 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6534f554-83b4-4fea-8b0d-824961d655d5-scripts" (OuterVolumeSpecName: "scripts") pod "6534f554-83b4-4fea-8b0d-824961d655d5" (UID: "6534f554-83b4-4fea-8b0d-824961d655d5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.541274 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6534f554-83b4-4fea-8b0d-824961d655d5-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "6534f554-83b4-4fea-8b0d-824961d655d5" (UID: "6534f554-83b4-4fea-8b0d-824961d655d5"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.544858 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6534f554-83b4-4fea-8b0d-824961d655d5-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "6534f554-83b4-4fea-8b0d-824961d655d5" (UID: "6534f554-83b4-4fea-8b0d-824961d655d5"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.545047 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6534f554-83b4-4fea-8b0d-824961d655d5-kube-api-access-dfpsd" (OuterVolumeSpecName: "kube-api-access-dfpsd") pod "6534f554-83b4-4fea-8b0d-824961d655d5" (UID: "6534f554-83b4-4fea-8b0d-824961d655d5"). InnerVolumeSpecName "kube-api-access-dfpsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.546928 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6534f554-83b4-4fea-8b0d-824961d655d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6534f554-83b4-4fea-8b0d-824961d655d5" (UID: "6534f554-83b4-4fea-8b0d-824961d655d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.549302 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6534f554-83b4-4fea-8b0d-824961d655d5-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "6534f554-83b4-4fea-8b0d-824961d655d5" (UID: "6534f554-83b4-4fea-8b0d-824961d655d5"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.609787 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-xm6wd" Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.629404 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d0b4-account-create-update-qssx2" Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.639461 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8463-account-create-update-psrz9" Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.644218 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c28ce93a-e28b-4be0-87bd-38e5dc1383df-operator-scripts\") pod \"c28ce93a-e28b-4be0-87bd-38e5dc1383df\" (UID: \"c28ce93a-e28b-4be0-87bd-38e5dc1383df\") " Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.644282 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d76d646-48a4-405f-ba5e-fa7ef1775294-operator-scripts\") pod \"3d76d646-48a4-405f-ba5e-fa7ef1775294\" (UID: \"3d76d646-48a4-405f-ba5e-fa7ef1775294\") " Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.644435 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-986h4\" (UniqueName: \"kubernetes.io/projected/3d76d646-48a4-405f-ba5e-fa7ef1775294-kube-api-access-986h4\") pod \"3d76d646-48a4-405f-ba5e-fa7ef1775294\" (UID: \"3d76d646-48a4-405f-ba5e-fa7ef1775294\") " Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.644472 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2s4tw\" (UniqueName: \"kubernetes.io/projected/c28ce93a-e28b-4be0-87bd-38e5dc1383df-kube-api-access-2s4tw\") pod \"c28ce93a-e28b-4be0-87bd-38e5dc1383df\" (UID: \"c28ce93a-e28b-4be0-87bd-38e5dc1383df\") " Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.644796 4734 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c28ce93a-e28b-4be0-87bd-38e5dc1383df-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c28ce93a-e28b-4be0-87bd-38e5dc1383df" (UID: "c28ce93a-e28b-4be0-87bd-38e5dc1383df"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.645252 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d76d646-48a4-405f-ba5e-fa7ef1775294-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3d76d646-48a4-405f-ba5e-fa7ef1775294" (UID: "3d76d646-48a4-405f-ba5e-fa7ef1775294"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.647959 4734 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6534f554-83b4-4fea-8b0d-824961d655d5-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.648022 4734 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6534f554-83b4-4fea-8b0d-824961d655d5-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.648033 4734 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6534f554-83b4-4fea-8b0d-824961d655d5-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.648044 4734 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c28ce93a-e28b-4be0-87bd-38e5dc1383df-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.648053 4734 reconciler_common.go:293] "Volume 
detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d76d646-48a4-405f-ba5e-fa7ef1775294-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.648063 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfpsd\" (UniqueName: \"kubernetes.io/projected/6534f554-83b4-4fea-8b0d-824961d655d5-kube-api-access-dfpsd\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.648075 4734 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6534f554-83b4-4fea-8b0d-824961d655d5-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.648105 4734 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6534f554-83b4-4fea-8b0d-824961d655d5-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.648116 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6534f554-83b4-4fea-8b0d-824961d655d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.653055 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c28ce93a-e28b-4be0-87bd-38e5dc1383df-kube-api-access-2s4tw" (OuterVolumeSpecName: "kube-api-access-2s4tw") pod "c28ce93a-e28b-4be0-87bd-38e5dc1383df" (UID: "c28ce93a-e28b-4be0-87bd-38e5dc1383df"). InnerVolumeSpecName "kube-api-access-2s4tw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.662798 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d76d646-48a4-405f-ba5e-fa7ef1775294-kube-api-access-986h4" (OuterVolumeSpecName: "kube-api-access-986h4") pod "3d76d646-48a4-405f-ba5e-fa7ef1775294" (UID: "3d76d646-48a4-405f-ba5e-fa7ef1775294"). InnerVolumeSpecName "kube-api-access-986h4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.704921 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-qdl57"] Dec 05 23:37:08 crc kubenswrapper[4734]: W1205 23:37:08.712768 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1e03821_b44b_4ce9_8fb9_6831bf8b087f.slice/crio-64b309db10c29093707891d0d3136a34c9af13cd98b38b040ab4b50d69e25718 WatchSource:0}: Error finding container 64b309db10c29093707891d0d3136a34c9af13cd98b38b040ab4b50d69e25718: Status 404 returned error can't find the container with id 64b309db10c29093707891d0d3136a34c9af13cd98b38b040ab4b50d69e25718 Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.749671 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40a48ff8-8c86-4d37-8962-bb74618c2558-operator-scripts\") pod \"40a48ff8-8c86-4d37-8962-bb74618c2558\" (UID: \"40a48ff8-8c86-4d37-8962-bb74618c2558\") " Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.749872 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpg6z\" (UniqueName: \"kubernetes.io/projected/40a48ff8-8c86-4d37-8962-bb74618c2558-kube-api-access-kpg6z\") pod \"40a48ff8-8c86-4d37-8962-bb74618c2558\" (UID: \"40a48ff8-8c86-4d37-8962-bb74618c2558\") " Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 
23:37:08.750267 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40a48ff8-8c86-4d37-8962-bb74618c2558-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "40a48ff8-8c86-4d37-8962-bb74618c2558" (UID: "40a48ff8-8c86-4d37-8962-bb74618c2558"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.750392 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-986h4\" (UniqueName: \"kubernetes.io/projected/3d76d646-48a4-405f-ba5e-fa7ef1775294-kube-api-access-986h4\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.750415 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2s4tw\" (UniqueName: \"kubernetes.io/projected/c28ce93a-e28b-4be0-87bd-38e5dc1383df-kube-api-access-2s4tw\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.756259 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40a48ff8-8c86-4d37-8962-bb74618c2558-kube-api-access-kpg6z" (OuterVolumeSpecName: "kube-api-access-kpg6z") pod "40a48ff8-8c86-4d37-8962-bb74618c2558" (UID: "40a48ff8-8c86-4d37-8962-bb74618c2558"). InnerVolumeSpecName "kube-api-access-kpg6z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.852771 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpg6z\" (UniqueName: \"kubernetes.io/projected/40a48ff8-8c86-4d37-8962-bb74618c2558-kube-api-access-kpg6z\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.852815 4734 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40a48ff8-8c86-4d37-8962-bb74618c2558-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:08 crc kubenswrapper[4734]: I1205 23:37:08.954934 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fea25d07-8cbc-4875-89e8-1752b0ee2a9e-etc-swift\") pod \"swift-storage-0\" (UID: \"fea25d07-8cbc-4875-89e8-1752b0ee2a9e\") " pod="openstack/swift-storage-0" Dec 05 23:37:08 crc kubenswrapper[4734]: E1205 23:37:08.955159 4734 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 23:37:08 crc kubenswrapper[4734]: E1205 23:37:08.955189 4734 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 23:37:08 crc kubenswrapper[4734]: E1205 23:37:08.955289 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fea25d07-8cbc-4875-89e8-1752b0ee2a9e-etc-swift podName:fea25d07-8cbc-4875-89e8-1752b0ee2a9e nodeName:}" failed. No retries permitted until 2025-12-05 23:37:10.955242764 +0000 UTC m=+1051.638647040 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fea25d07-8cbc-4875-89e8-1752b0ee2a9e-etc-swift") pod "swift-storage-0" (UID: "fea25d07-8cbc-4875-89e8-1752b0ee2a9e") : configmap "swift-ring-files" not found Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.455892 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d0b4-account-create-update-qssx2" event={"ID":"c28ce93a-e28b-4be0-87bd-38e5dc1383df","Type":"ContainerDied","Data":"99b68f4124008aeecdffb3d8201a58c2948e277850ec891bb6e7db77bb878dc3"} Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.456406 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99b68f4124008aeecdffb3d8201a58c2948e277850ec891bb6e7db77bb878dc3" Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.455922 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d0b4-account-create-update-qssx2" Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.458333 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qdl57" event={"ID":"a1e03821-b44b-4ce9-8fb9-6831bf8b087f","Type":"ContainerStarted","Data":"64b309db10c29093707891d0d3136a34c9af13cd98b38b040ab4b50d69e25718"} Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.460803 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8463-account-create-update-psrz9" event={"ID":"40a48ff8-8c86-4d37-8962-bb74618c2558","Type":"ContainerDied","Data":"f24e000e1656242c3aeedb68fb7e261acf111fd6c0f2e36e3d18e16eb42e9034"} Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.460844 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f24e000e1656242c3aeedb68fb7e261acf111fd6c0f2e36e3d18e16eb42e9034" Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.460809 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8463-account-create-update-psrz9" Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.462791 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-xm6wd" event={"ID":"3d76d646-48a4-405f-ba5e-fa7ef1775294","Type":"ContainerDied","Data":"03f8753325fbeddd6665368095188b20dbf055c070838b4e1b601dac8931f5f1"} Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.462892 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03f8753325fbeddd6665368095188b20dbf055c070838b4e1b601dac8931f5f1" Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.462849 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-xm6wd" Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.462808 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-7kszd" Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.525399 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-7kszd"] Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.542730 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-7kszd"] Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.554872 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-s269z"] Dec 05 23:37:09 crc kubenswrapper[4734]: E1205 23:37:09.555953 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40a48ff8-8c86-4d37-8962-bb74618c2558" containerName="mariadb-account-create-update" Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.556084 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="40a48ff8-8c86-4d37-8962-bb74618c2558" containerName="mariadb-account-create-update" Dec 05 23:37:09 crc kubenswrapper[4734]: E1205 23:37:09.556149 4734 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="c28ce93a-e28b-4be0-87bd-38e5dc1383df" containerName="mariadb-account-create-update" Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.556214 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="c28ce93a-e28b-4be0-87bd-38e5dc1383df" containerName="mariadb-account-create-update" Dec 05 23:37:09 crc kubenswrapper[4734]: E1205 23:37:09.556291 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d76d646-48a4-405f-ba5e-fa7ef1775294" containerName="mariadb-database-create" Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.556351 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d76d646-48a4-405f-ba5e-fa7ef1775294" containerName="mariadb-database-create" Dec 05 23:37:09 crc kubenswrapper[4734]: E1205 23:37:09.556415 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0266b9f-86d4-462a-a20f-8897e48bfa43" containerName="mariadb-database-create" Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.556472 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0266b9f-86d4-462a-a20f-8897e48bfa43" containerName="mariadb-database-create" Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.556735 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d76d646-48a4-405f-ba5e-fa7ef1775294" containerName="mariadb-database-create" Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.556809 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0266b9f-86d4-462a-a20f-8897e48bfa43" containerName="mariadb-database-create" Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.556868 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="40a48ff8-8c86-4d37-8962-bb74618c2558" containerName="mariadb-account-create-update" Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.556922 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="c28ce93a-e28b-4be0-87bd-38e5dc1383df" containerName="mariadb-account-create-update" Dec 05 
23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.557674 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-s269z" Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.567871 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-s269z"] Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.626787 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6534f554-83b4-4fea-8b0d-824961d655d5" path="/var/lib/kubelet/pods/6534f554-83b4-4fea-8b0d-824961d655d5/volumes" Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.669608 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2msr\" (UniqueName: \"kubernetes.io/projected/9614c3a8-524e-4641-9abb-a991a9c884ae-kube-api-access-n2msr\") pod \"glance-db-create-s269z\" (UID: \"9614c3a8-524e-4641-9abb-a991a9c884ae\") " pod="openstack/glance-db-create-s269z" Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.669753 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9614c3a8-524e-4641-9abb-a991a9c884ae-operator-scripts\") pod \"glance-db-create-s269z\" (UID: \"9614c3a8-524e-4641-9abb-a991a9c884ae\") " pod="openstack/glance-db-create-s269z" Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.754348 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-351c-account-create-update-npg5f"] Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.756237 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-351c-account-create-update-npg5f" Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.758834 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.773022 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/027d4639-edeb-422d-b5c5-f8ecfcd704dd-operator-scripts\") pod \"glance-351c-account-create-update-npg5f\" (UID: \"027d4639-edeb-422d-b5c5-f8ecfcd704dd\") " pod="openstack/glance-351c-account-create-update-npg5f" Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.773460 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfnxd\" (UniqueName: \"kubernetes.io/projected/027d4639-edeb-422d-b5c5-f8ecfcd704dd-kube-api-access-gfnxd\") pod \"glance-351c-account-create-update-npg5f\" (UID: \"027d4639-edeb-422d-b5c5-f8ecfcd704dd\") " pod="openstack/glance-351c-account-create-update-npg5f" Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.773609 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2msr\" (UniqueName: \"kubernetes.io/projected/9614c3a8-524e-4641-9abb-a991a9c884ae-kube-api-access-n2msr\") pod \"glance-db-create-s269z\" (UID: \"9614c3a8-524e-4641-9abb-a991a9c884ae\") " pod="openstack/glance-db-create-s269z" Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.773754 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9614c3a8-524e-4641-9abb-a991a9c884ae-operator-scripts\") pod \"glance-db-create-s269z\" (UID: \"9614c3a8-524e-4641-9abb-a991a9c884ae\") " pod="openstack/glance-db-create-s269z" Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.774858 4734 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9614c3a8-524e-4641-9abb-a991a9c884ae-operator-scripts\") pod \"glance-db-create-s269z\" (UID: \"9614c3a8-524e-4641-9abb-a991a9c884ae\") " pod="openstack/glance-db-create-s269z" Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.787759 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-351c-account-create-update-npg5f"] Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.797085 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2msr\" (UniqueName: \"kubernetes.io/projected/9614c3a8-524e-4641-9abb-a991a9c884ae-kube-api-access-n2msr\") pod \"glance-db-create-s269z\" (UID: \"9614c3a8-524e-4641-9abb-a991a9c884ae\") " pod="openstack/glance-db-create-s269z" Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.876424 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-s269z" Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.877307 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/027d4639-edeb-422d-b5c5-f8ecfcd704dd-operator-scripts\") pod \"glance-351c-account-create-update-npg5f\" (UID: \"027d4639-edeb-422d-b5c5-f8ecfcd704dd\") " pod="openstack/glance-351c-account-create-update-npg5f" Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.877406 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfnxd\" (UniqueName: \"kubernetes.io/projected/027d4639-edeb-422d-b5c5-f8ecfcd704dd-kube-api-access-gfnxd\") pod \"glance-351c-account-create-update-npg5f\" (UID: \"027d4639-edeb-422d-b5c5-f8ecfcd704dd\") " pod="openstack/glance-351c-account-create-update-npg5f" Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.878702 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/027d4639-edeb-422d-b5c5-f8ecfcd704dd-operator-scripts\") pod \"glance-351c-account-create-update-npg5f\" (UID: \"027d4639-edeb-422d-b5c5-f8ecfcd704dd\") " pod="openstack/glance-351c-account-create-update-npg5f" Dec 05 23:37:09 crc kubenswrapper[4734]: I1205 23:37:09.900363 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfnxd\" (UniqueName: \"kubernetes.io/projected/027d4639-edeb-422d-b5c5-f8ecfcd704dd-kube-api-access-gfnxd\") pod \"glance-351c-account-create-update-npg5f\" (UID: \"027d4639-edeb-422d-b5c5-f8ecfcd704dd\") " pod="openstack/glance-351c-account-create-update-npg5f" Dec 05 23:37:10 crc kubenswrapper[4734]: I1205 23:37:10.076699 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-351c-account-create-update-npg5f" Dec 05 23:37:10 crc kubenswrapper[4734]: I1205 23:37:10.368048 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-s269z"] Dec 05 23:37:10 crc kubenswrapper[4734]: W1205 23:37:10.379382 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9614c3a8_524e_4641_9abb_a991a9c884ae.slice/crio-76ad6a123346d4da7ebe0e7572f0ebf0395f8443f1157b982d20f8b8618e7bd6 WatchSource:0}: Error finding container 76ad6a123346d4da7ebe0e7572f0ebf0395f8443f1157b982d20f8b8618e7bd6: Status 404 returned error can't find the container with id 76ad6a123346d4da7ebe0e7572f0ebf0395f8443f1157b982d20f8b8618e7bd6 Dec 05 23:37:10 crc kubenswrapper[4734]: I1205 23:37:10.488936 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-l7vfg" event={"ID":"e0c9ffcb-625f-49f8-869a-e71e5f53b92b","Type":"ContainerStarted","Data":"d3435168d1c4d378de9e64845e024fdbc2764b4dbac12d6383881b6a4ac18b30"} Dec 05 23:37:10 crc kubenswrapper[4734]: I1205 23:37:10.489824 4734 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-l7vfg" Dec 05 23:37:10 crc kubenswrapper[4734]: I1205 23:37:10.491305 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-s269z" event={"ID":"9614c3a8-524e-4641-9abb-a991a9c884ae","Type":"ContainerStarted","Data":"76ad6a123346d4da7ebe0e7572f0ebf0395f8443f1157b982d20f8b8618e7bd6"} Dec 05 23:37:10 crc kubenswrapper[4734]: I1205 23:37:10.517425 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-l7vfg" podStartSLOduration=5.517395478 podStartE2EDuration="5.517395478s" podCreationTimestamp="2025-12-05 23:37:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:37:10.511334412 +0000 UTC m=+1051.194738688" watchObservedRunningTime="2025-12-05 23:37:10.517395478 +0000 UTC m=+1051.200799754" Dec 05 23:37:10 crc kubenswrapper[4734]: I1205 23:37:10.576841 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-351c-account-create-update-npg5f"] Dec 05 23:37:10 crc kubenswrapper[4734]: W1205 23:37:10.591871 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod027d4639_edeb_422d_b5c5_f8ecfcd704dd.slice/crio-059df66b59f735c251fa79475f85265a0799614543da2c953c1e7d060201467a WatchSource:0}: Error finding container 059df66b59f735c251fa79475f85265a0799614543da2c953c1e7d060201467a: Status 404 returned error can't find the container with id 059df66b59f735c251fa79475f85265a0799614543da2c953c1e7d060201467a Dec 05 23:37:11 crc kubenswrapper[4734]: I1205 23:37:11.005472 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fea25d07-8cbc-4875-89e8-1752b0ee2a9e-etc-swift\") pod \"swift-storage-0\" (UID: \"fea25d07-8cbc-4875-89e8-1752b0ee2a9e\") " 
pod="openstack/swift-storage-0" Dec 05 23:37:11 crc kubenswrapper[4734]: E1205 23:37:11.006116 4734 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 23:37:11 crc kubenswrapper[4734]: E1205 23:37:11.006133 4734 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 23:37:11 crc kubenswrapper[4734]: E1205 23:37:11.006187 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fea25d07-8cbc-4875-89e8-1752b0ee2a9e-etc-swift podName:fea25d07-8cbc-4875-89e8-1752b0ee2a9e nodeName:}" failed. No retries permitted until 2025-12-05 23:37:15.006169526 +0000 UTC m=+1055.689573802 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fea25d07-8cbc-4875-89e8-1752b0ee2a9e-etc-swift") pod "swift-storage-0" (UID: "fea25d07-8cbc-4875-89e8-1752b0ee2a9e") : configmap "swift-ring-files" not found Dec 05 23:37:11 crc kubenswrapper[4734]: I1205 23:37:11.508648 4734 generic.go:334] "Generic (PLEG): container finished" podID="9614c3a8-524e-4641-9abb-a991a9c884ae" containerID="a74d873d0501707e58d8df6b4cc5aa70659cc956a4c62572ff2ec301f126c5ec" exitCode=0 Dec 05 23:37:11 crc kubenswrapper[4734]: I1205 23:37:11.508723 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-s269z" event={"ID":"9614c3a8-524e-4641-9abb-a991a9c884ae","Type":"ContainerDied","Data":"a74d873d0501707e58d8df6b4cc5aa70659cc956a4c62572ff2ec301f126c5ec"} Dec 05 23:37:11 crc kubenswrapper[4734]: I1205 23:37:11.511940 4734 generic.go:334] "Generic (PLEG): container finished" podID="027d4639-edeb-422d-b5c5-f8ecfcd704dd" containerID="cdac87b363b6aea38969a00571152c140ad2cb5e61e04cec9c43b444ed891cbf" exitCode=0 Dec 05 23:37:11 crc kubenswrapper[4734]: I1205 23:37:11.513090 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-351c-account-create-update-npg5f" event={"ID":"027d4639-edeb-422d-b5c5-f8ecfcd704dd","Type":"ContainerDied","Data":"cdac87b363b6aea38969a00571152c140ad2cb5e61e04cec9c43b444ed891cbf"} Dec 05 23:37:11 crc kubenswrapper[4734]: I1205 23:37:11.513158 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-351c-account-create-update-npg5f" event={"ID":"027d4639-edeb-422d-b5c5-f8ecfcd704dd","Type":"ContainerStarted","Data":"059df66b59f735c251fa79475f85265a0799614543da2c953c1e7d060201467a"} Dec 05 23:37:14 crc kubenswrapper[4734]: I1205 23:37:14.555827 4734 generic.go:334] "Generic (PLEG): container finished" podID="c35eaa12-d993-4769-975b-35a5ac6609e0" containerID="c5950e0647bc09668aa5bb962f9d7316a20f3b0c1e9e76366f98c21ef1804c9e" exitCode=0 Dec 05 23:37:14 crc kubenswrapper[4734]: I1205 23:37:14.556030 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c35eaa12-d993-4769-975b-35a5ac6609e0","Type":"ContainerDied","Data":"c5950e0647bc09668aa5bb962f9d7316a20f3b0c1e9e76366f98c21ef1804c9e"} Dec 05 23:37:14 crc kubenswrapper[4734]: I1205 23:37:14.563375 4734 generic.go:334] "Generic (PLEG): container finished" podID="ed95027c-1ded-4127-a341-7ee81018d4b6" containerID="66e5a249cf9e8b0a22292ba791d1aa360ef84159f879636f2af4ddcac64c1e31" exitCode=0 Dec 05 23:37:14 crc kubenswrapper[4734]: I1205 23:37:14.563519 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ed95027c-1ded-4127-a341-7ee81018d4b6","Type":"ContainerDied","Data":"66e5a249cf9e8b0a22292ba791d1aa360ef84159f879636f2af4ddcac64c1e31"} Dec 05 23:37:14 crc kubenswrapper[4734]: I1205 23:37:14.565964 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-351c-account-create-update-npg5f" event={"ID":"027d4639-edeb-422d-b5c5-f8ecfcd704dd","Type":"ContainerDied","Data":"059df66b59f735c251fa79475f85265a0799614543da2c953c1e7d060201467a"} Dec 05 23:37:14 crc 
kubenswrapper[4734]: I1205 23:37:14.566028 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="059df66b59f735c251fa79475f85265a0799614543da2c953c1e7d060201467a" Dec 05 23:37:14 crc kubenswrapper[4734]: I1205 23:37:14.569732 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-s269z" event={"ID":"9614c3a8-524e-4641-9abb-a991a9c884ae","Type":"ContainerDied","Data":"76ad6a123346d4da7ebe0e7572f0ebf0395f8443f1157b982d20f8b8618e7bd6"} Dec 05 23:37:14 crc kubenswrapper[4734]: I1205 23:37:14.569783 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76ad6a123346d4da7ebe0e7572f0ebf0395f8443f1157b982d20f8b8618e7bd6" Dec 05 23:37:14 crc kubenswrapper[4734]: I1205 23:37:14.691879 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-351c-account-create-update-npg5f" Dec 05 23:37:14 crc kubenswrapper[4734]: I1205 23:37:14.711698 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-s269z" Dec 05 23:37:14 crc kubenswrapper[4734]: I1205 23:37:14.799690 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9614c3a8-524e-4641-9abb-a991a9c884ae-operator-scripts\") pod \"9614c3a8-524e-4641-9abb-a991a9c884ae\" (UID: \"9614c3a8-524e-4641-9abb-a991a9c884ae\") " Dec 05 23:37:14 crc kubenswrapper[4734]: I1205 23:37:14.799800 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/027d4639-edeb-422d-b5c5-f8ecfcd704dd-operator-scripts\") pod \"027d4639-edeb-422d-b5c5-f8ecfcd704dd\" (UID: \"027d4639-edeb-422d-b5c5-f8ecfcd704dd\") " Dec 05 23:37:14 crc kubenswrapper[4734]: I1205 23:37:14.799882 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfnxd\" (UniqueName: \"kubernetes.io/projected/027d4639-edeb-422d-b5c5-f8ecfcd704dd-kube-api-access-gfnxd\") pod \"027d4639-edeb-422d-b5c5-f8ecfcd704dd\" (UID: \"027d4639-edeb-422d-b5c5-f8ecfcd704dd\") " Dec 05 23:37:14 crc kubenswrapper[4734]: I1205 23:37:14.799972 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2msr\" (UniqueName: \"kubernetes.io/projected/9614c3a8-524e-4641-9abb-a991a9c884ae-kube-api-access-n2msr\") pod \"9614c3a8-524e-4641-9abb-a991a9c884ae\" (UID: \"9614c3a8-524e-4641-9abb-a991a9c884ae\") " Dec 05 23:37:14 crc kubenswrapper[4734]: I1205 23:37:14.800107 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9614c3a8-524e-4641-9abb-a991a9c884ae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9614c3a8-524e-4641-9abb-a991a9c884ae" (UID: "9614c3a8-524e-4641-9abb-a991a9c884ae"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:37:14 crc kubenswrapper[4734]: I1205 23:37:14.800520 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/027d4639-edeb-422d-b5c5-f8ecfcd704dd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "027d4639-edeb-422d-b5c5-f8ecfcd704dd" (UID: "027d4639-edeb-422d-b5c5-f8ecfcd704dd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:37:14 crc kubenswrapper[4734]: I1205 23:37:14.800722 4734 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9614c3a8-524e-4641-9abb-a991a9c884ae-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:14 crc kubenswrapper[4734]: I1205 23:37:14.807017 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/027d4639-edeb-422d-b5c5-f8ecfcd704dd-kube-api-access-gfnxd" (OuterVolumeSpecName: "kube-api-access-gfnxd") pod "027d4639-edeb-422d-b5c5-f8ecfcd704dd" (UID: "027d4639-edeb-422d-b5c5-f8ecfcd704dd"). InnerVolumeSpecName "kube-api-access-gfnxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:37:14 crc kubenswrapper[4734]: I1205 23:37:14.807166 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9614c3a8-524e-4641-9abb-a991a9c884ae-kube-api-access-n2msr" (OuterVolumeSpecName: "kube-api-access-n2msr") pod "9614c3a8-524e-4641-9abb-a991a9c884ae" (UID: "9614c3a8-524e-4641-9abb-a991a9c884ae"). InnerVolumeSpecName "kube-api-access-n2msr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:37:14 crc kubenswrapper[4734]: I1205 23:37:14.903023 4734 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/027d4639-edeb-422d-b5c5-f8ecfcd704dd-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:14 crc kubenswrapper[4734]: I1205 23:37:14.903067 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfnxd\" (UniqueName: \"kubernetes.io/projected/027d4639-edeb-422d-b5c5-f8ecfcd704dd-kube-api-access-gfnxd\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:14 crc kubenswrapper[4734]: I1205 23:37:14.903079 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2msr\" (UniqueName: \"kubernetes.io/projected/9614c3a8-524e-4641-9abb-a991a9c884ae-kube-api-access-n2msr\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:15 crc kubenswrapper[4734]: I1205 23:37:15.022085 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 05 23:37:15 crc kubenswrapper[4734]: I1205 23:37:15.106798 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fea25d07-8cbc-4875-89e8-1752b0ee2a9e-etc-swift\") pod \"swift-storage-0\" (UID: \"fea25d07-8cbc-4875-89e8-1752b0ee2a9e\") " pod="openstack/swift-storage-0" Dec 05 23:37:15 crc kubenswrapper[4734]: E1205 23:37:15.106920 4734 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 23:37:15 crc kubenswrapper[4734]: E1205 23:37:15.109389 4734 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 23:37:15 crc kubenswrapper[4734]: E1205 23:37:15.109446 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fea25d07-8cbc-4875-89e8-1752b0ee2a9e-etc-swift 
podName:fea25d07-8cbc-4875-89e8-1752b0ee2a9e nodeName:}" failed. No retries permitted until 2025-12-05 23:37:23.109426244 +0000 UTC m=+1063.792830520 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fea25d07-8cbc-4875-89e8-1752b0ee2a9e-etc-swift") pod "swift-storage-0" (UID: "fea25d07-8cbc-4875-89e8-1752b0ee2a9e") : configmap "swift-ring-files" not found Dec 05 23:37:15 crc kubenswrapper[4734]: I1205 23:37:15.580695 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c35eaa12-d993-4769-975b-35a5ac6609e0","Type":"ContainerStarted","Data":"05f9dd5d16e65b3b556d3d6b976b02a62a361e9e97c167b7525d95c4d23713ac"} Dec 05 23:37:15 crc kubenswrapper[4734]: I1205 23:37:15.582128 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 05 23:37:15 crc kubenswrapper[4734]: I1205 23:37:15.586345 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ed95027c-1ded-4127-a341-7ee81018d4b6","Type":"ContainerStarted","Data":"e568c4fef629437d46890910627d0796cf4be63170bfe6ba198f3522a7208650"} Dec 05 23:37:15 crc kubenswrapper[4734]: I1205 23:37:15.586697 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:37:15 crc kubenswrapper[4734]: I1205 23:37:15.589568 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qdl57" event={"ID":"a1e03821-b44b-4ce9-8fb9-6831bf8b087f","Type":"ContainerStarted","Data":"53307a805cb48b051f9646b7dd535a0c872ce431e6f5c31679621c480f2ed5a4"} Dec 05 23:37:15 crc kubenswrapper[4734]: I1205 23:37:15.589629 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-351c-account-create-update-npg5f" Dec 05 23:37:15 crc kubenswrapper[4734]: I1205 23:37:15.589732 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-s269z" Dec 05 23:37:15 crc kubenswrapper[4734]: I1205 23:37:15.626069 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371970.228739 podStartE2EDuration="1m6.626036666s" podCreationTimestamp="2025-12-05 23:36:09 +0000 UTC" firstStartedPulling="2025-12-05 23:36:11.809656193 +0000 UTC m=+992.493060469" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:37:15.613122594 +0000 UTC m=+1056.296526870" watchObservedRunningTime="2025-12-05 23:37:15.626036666 +0000 UTC m=+1056.309440942" Dec 05 23:37:15 crc kubenswrapper[4734]: I1205 23:37:15.653508 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.967144814 podStartE2EDuration="1m5.65347382s" podCreationTimestamp="2025-12-05 23:36:10 +0000 UTC" firstStartedPulling="2025-12-05 23:36:12.238405997 +0000 UTC m=+992.921810273" lastFinishedPulling="2025-12-05 23:36:39.924735003 +0000 UTC m=+1020.608139279" observedRunningTime="2025-12-05 23:37:15.648163052 +0000 UTC m=+1056.331567348" watchObservedRunningTime="2025-12-05 23:37:15.65347382 +0000 UTC m=+1056.336878096" Dec 05 23:37:15 crc kubenswrapper[4734]: I1205 23:37:15.674276 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-qdl57" podStartSLOduration=2.886734095 podStartE2EDuration="8.674252954s" podCreationTimestamp="2025-12-05 23:37:07 +0000 UTC" firstStartedPulling="2025-12-05 23:37:08.7159992 +0000 UTC m=+1049.399403476" lastFinishedPulling="2025-12-05 23:37:14.503518059 +0000 UTC m=+1055.186922335" observedRunningTime="2025-12-05 23:37:15.671032656 +0000 UTC 
m=+1056.354436932" watchObservedRunningTime="2025-12-05 23:37:15.674252954 +0000 UTC m=+1056.357657230" Dec 05 23:37:16 crc kubenswrapper[4734]: I1205 23:37:16.355675 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-l7vfg" Dec 05 23:37:16 crc kubenswrapper[4734]: I1205 23:37:16.532273 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-c9btw"] Dec 05 23:37:16 crc kubenswrapper[4734]: I1205 23:37:16.532703 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-c9btw" podUID="115aa742-8de4-4cb2-84e2-c05f698eda5e" containerName="dnsmasq-dns" containerID="cri-o://77a7fadb2a77e908c44d03871ac6d355c03e2cdf50b1dd02851450d8cf30131d" gracePeriod=10 Dec 05 23:37:17 crc kubenswrapper[4734]: I1205 23:37:17.450079 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-c9btw" Dec 05 23:37:17 crc kubenswrapper[4734]: I1205 23:37:17.583459 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vn9bj\" (UniqueName: \"kubernetes.io/projected/115aa742-8de4-4cb2-84e2-c05f698eda5e-kube-api-access-vn9bj\") pod \"115aa742-8de4-4cb2-84e2-c05f698eda5e\" (UID: \"115aa742-8de4-4cb2-84e2-c05f698eda5e\") " Dec 05 23:37:17 crc kubenswrapper[4734]: I1205 23:37:17.584895 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/115aa742-8de4-4cb2-84e2-c05f698eda5e-config\") pod \"115aa742-8de4-4cb2-84e2-c05f698eda5e\" (UID: \"115aa742-8de4-4cb2-84e2-c05f698eda5e\") " Dec 05 23:37:17 crc kubenswrapper[4734]: I1205 23:37:17.585079 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/115aa742-8de4-4cb2-84e2-c05f698eda5e-ovsdbserver-sb\") pod 
\"115aa742-8de4-4cb2-84e2-c05f698eda5e\" (UID: \"115aa742-8de4-4cb2-84e2-c05f698eda5e\") " Dec 05 23:37:17 crc kubenswrapper[4734]: I1205 23:37:17.585122 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/115aa742-8de4-4cb2-84e2-c05f698eda5e-ovsdbserver-nb\") pod \"115aa742-8de4-4cb2-84e2-c05f698eda5e\" (UID: \"115aa742-8de4-4cb2-84e2-c05f698eda5e\") " Dec 05 23:37:17 crc kubenswrapper[4734]: I1205 23:37:17.585195 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/115aa742-8de4-4cb2-84e2-c05f698eda5e-dns-svc\") pod \"115aa742-8de4-4cb2-84e2-c05f698eda5e\" (UID: \"115aa742-8de4-4cb2-84e2-c05f698eda5e\") " Dec 05 23:37:17 crc kubenswrapper[4734]: I1205 23:37:17.595806 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/115aa742-8de4-4cb2-84e2-c05f698eda5e-kube-api-access-vn9bj" (OuterVolumeSpecName: "kube-api-access-vn9bj") pod "115aa742-8de4-4cb2-84e2-c05f698eda5e" (UID: "115aa742-8de4-4cb2-84e2-c05f698eda5e"). InnerVolumeSpecName "kube-api-access-vn9bj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:37:17 crc kubenswrapper[4734]: I1205 23:37:17.613135 4734 generic.go:334] "Generic (PLEG): container finished" podID="115aa742-8de4-4cb2-84e2-c05f698eda5e" containerID="77a7fadb2a77e908c44d03871ac6d355c03e2cdf50b1dd02851450d8cf30131d" exitCode=0 Dec 05 23:37:17 crc kubenswrapper[4734]: I1205 23:37:17.613188 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-c9btw" event={"ID":"115aa742-8de4-4cb2-84e2-c05f698eda5e","Type":"ContainerDied","Data":"77a7fadb2a77e908c44d03871ac6d355c03e2cdf50b1dd02851450d8cf30131d"} Dec 05 23:37:17 crc kubenswrapper[4734]: I1205 23:37:17.613228 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-c9btw" event={"ID":"115aa742-8de4-4cb2-84e2-c05f698eda5e","Type":"ContainerDied","Data":"46761160ea59dc9ae2cd9b52e44fc6da30478bf985ed4e0b77110af8b99bf010"} Dec 05 23:37:17 crc kubenswrapper[4734]: I1205 23:37:17.613252 4734 scope.go:117] "RemoveContainer" containerID="77a7fadb2a77e908c44d03871ac6d355c03e2cdf50b1dd02851450d8cf30131d" Dec 05 23:37:17 crc kubenswrapper[4734]: I1205 23:37:17.613411 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-c9btw" Dec 05 23:37:17 crc kubenswrapper[4734]: I1205 23:37:17.670061 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/115aa742-8de4-4cb2-84e2-c05f698eda5e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "115aa742-8de4-4cb2-84e2-c05f698eda5e" (UID: "115aa742-8de4-4cb2-84e2-c05f698eda5e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:37:17 crc kubenswrapper[4734]: I1205 23:37:17.685097 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/115aa742-8de4-4cb2-84e2-c05f698eda5e-config" (OuterVolumeSpecName: "config") pod "115aa742-8de4-4cb2-84e2-c05f698eda5e" (UID: "115aa742-8de4-4cb2-84e2-c05f698eda5e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:37:17 crc kubenswrapper[4734]: I1205 23:37:17.811671 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/115aa742-8de4-4cb2-84e2-c05f698eda5e-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:17 crc kubenswrapper[4734]: I1205 23:37:17.811716 4734 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/115aa742-8de4-4cb2-84e2-c05f698eda5e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:17 crc kubenswrapper[4734]: I1205 23:37:17.811727 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vn9bj\" (UniqueName: \"kubernetes.io/projected/115aa742-8de4-4cb2-84e2-c05f698eda5e-kube-api-access-vn9bj\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:17 crc kubenswrapper[4734]: I1205 23:37:17.891933 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/115aa742-8de4-4cb2-84e2-c05f698eda5e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "115aa742-8de4-4cb2-84e2-c05f698eda5e" (UID: "115aa742-8de4-4cb2-84e2-c05f698eda5e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:37:17 crc kubenswrapper[4734]: I1205 23:37:17.909691 4734 scope.go:117] "RemoveContainer" containerID="b240d5852fcb6123296ac2c6a6b3368ebcb90e5964721081cba3196355f06d85" Dec 05 23:37:18 crc kubenswrapper[4734]: I1205 23:37:18.021207 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/115aa742-8de4-4cb2-84e2-c05f698eda5e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "115aa742-8de4-4cb2-84e2-c05f698eda5e" (UID: "115aa742-8de4-4cb2-84e2-c05f698eda5e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:37:18 crc kubenswrapper[4734]: I1205 23:37:18.022081 4734 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/115aa742-8de4-4cb2-84e2-c05f698eda5e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:18 crc kubenswrapper[4734]: I1205 23:37:18.022124 4734 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/115aa742-8de4-4cb2-84e2-c05f698eda5e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:18 crc kubenswrapper[4734]: I1205 23:37:18.279633 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-c9btw"] Dec 05 23:37:18 crc kubenswrapper[4734]: I1205 23:37:18.286399 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-c9btw"] Dec 05 23:37:18 crc kubenswrapper[4734]: I1205 23:37:18.316788 4734 scope.go:117] "RemoveContainer" containerID="77a7fadb2a77e908c44d03871ac6d355c03e2cdf50b1dd02851450d8cf30131d" Dec 05 23:37:18 crc kubenswrapper[4734]: E1205 23:37:18.317350 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77a7fadb2a77e908c44d03871ac6d355c03e2cdf50b1dd02851450d8cf30131d\": container with ID 
starting with 77a7fadb2a77e908c44d03871ac6d355c03e2cdf50b1dd02851450d8cf30131d not found: ID does not exist" containerID="77a7fadb2a77e908c44d03871ac6d355c03e2cdf50b1dd02851450d8cf30131d" Dec 05 23:37:18 crc kubenswrapper[4734]: I1205 23:37:18.317422 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77a7fadb2a77e908c44d03871ac6d355c03e2cdf50b1dd02851450d8cf30131d"} err="failed to get container status \"77a7fadb2a77e908c44d03871ac6d355c03e2cdf50b1dd02851450d8cf30131d\": rpc error: code = NotFound desc = could not find container \"77a7fadb2a77e908c44d03871ac6d355c03e2cdf50b1dd02851450d8cf30131d\": container with ID starting with 77a7fadb2a77e908c44d03871ac6d355c03e2cdf50b1dd02851450d8cf30131d not found: ID does not exist" Dec 05 23:37:18 crc kubenswrapper[4734]: I1205 23:37:18.317462 4734 scope.go:117] "RemoveContainer" containerID="b240d5852fcb6123296ac2c6a6b3368ebcb90e5964721081cba3196355f06d85" Dec 05 23:37:18 crc kubenswrapper[4734]: E1205 23:37:18.318076 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b240d5852fcb6123296ac2c6a6b3368ebcb90e5964721081cba3196355f06d85\": container with ID starting with b240d5852fcb6123296ac2c6a6b3368ebcb90e5964721081cba3196355f06d85 not found: ID does not exist" containerID="b240d5852fcb6123296ac2c6a6b3368ebcb90e5964721081cba3196355f06d85" Dec 05 23:37:18 crc kubenswrapper[4734]: I1205 23:37:18.318127 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b240d5852fcb6123296ac2c6a6b3368ebcb90e5964721081cba3196355f06d85"} err="failed to get container status \"b240d5852fcb6123296ac2c6a6b3368ebcb90e5964721081cba3196355f06d85\": rpc error: code = NotFound desc = could not find container \"b240d5852fcb6123296ac2c6a6b3368ebcb90e5964721081cba3196355f06d85\": container with ID starting with b240d5852fcb6123296ac2c6a6b3368ebcb90e5964721081cba3196355f06d85 not found: 
ID does not exist" Dec 05 23:37:19 crc kubenswrapper[4734]: I1205 23:37:19.638634 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="115aa742-8de4-4cb2-84e2-c05f698eda5e" path="/var/lib/kubelet/pods/115aa742-8de4-4cb2-84e2-c05f698eda5e/volumes" Dec 05 23:37:19 crc kubenswrapper[4734]: I1205 23:37:19.969692 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-wzn74"] Dec 05 23:37:19 crc kubenswrapper[4734]: E1205 23:37:19.970426 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="115aa742-8de4-4cb2-84e2-c05f698eda5e" containerName="dnsmasq-dns" Dec 05 23:37:19 crc kubenswrapper[4734]: I1205 23:37:19.970441 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="115aa742-8de4-4cb2-84e2-c05f698eda5e" containerName="dnsmasq-dns" Dec 05 23:37:19 crc kubenswrapper[4734]: E1205 23:37:19.970456 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="027d4639-edeb-422d-b5c5-f8ecfcd704dd" containerName="mariadb-account-create-update" Dec 05 23:37:19 crc kubenswrapper[4734]: I1205 23:37:19.970463 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="027d4639-edeb-422d-b5c5-f8ecfcd704dd" containerName="mariadb-account-create-update" Dec 05 23:37:19 crc kubenswrapper[4734]: E1205 23:37:19.970487 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9614c3a8-524e-4641-9abb-a991a9c884ae" containerName="mariadb-database-create" Dec 05 23:37:19 crc kubenswrapper[4734]: I1205 23:37:19.970494 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="9614c3a8-524e-4641-9abb-a991a9c884ae" containerName="mariadb-database-create" Dec 05 23:37:19 crc kubenswrapper[4734]: E1205 23:37:19.970502 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="115aa742-8de4-4cb2-84e2-c05f698eda5e" containerName="init" Dec 05 23:37:19 crc kubenswrapper[4734]: I1205 23:37:19.970509 4734 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="115aa742-8de4-4cb2-84e2-c05f698eda5e" containerName="init" Dec 05 23:37:19 crc kubenswrapper[4734]: I1205 23:37:19.970692 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="9614c3a8-524e-4641-9abb-a991a9c884ae" containerName="mariadb-database-create" Dec 05 23:37:19 crc kubenswrapper[4734]: I1205 23:37:19.970718 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="115aa742-8de4-4cb2-84e2-c05f698eda5e" containerName="dnsmasq-dns" Dec 05 23:37:19 crc kubenswrapper[4734]: I1205 23:37:19.970732 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="027d4639-edeb-422d-b5c5-f8ecfcd704dd" containerName="mariadb-account-create-update" Dec 05 23:37:19 crc kubenswrapper[4734]: I1205 23:37:19.971374 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wzn74" Dec 05 23:37:19 crc kubenswrapper[4734]: I1205 23:37:19.974252 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 05 23:37:19 crc kubenswrapper[4734]: I1205 23:37:19.974687 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-m6gpz" Dec 05 23:37:20 crc kubenswrapper[4734]: I1205 23:37:20.000845 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-wzn74"] Dec 05 23:37:20 crc kubenswrapper[4734]: I1205 23:37:20.084054 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/741e9328-bc42-4fae-b3dd-316f3286fa42-db-sync-config-data\") pod \"glance-db-sync-wzn74\" (UID: \"741e9328-bc42-4fae-b3dd-316f3286fa42\") " pod="openstack/glance-db-sync-wzn74" Dec 05 23:37:20 crc kubenswrapper[4734]: I1205 23:37:20.084126 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/741e9328-bc42-4fae-b3dd-316f3286fa42-combined-ca-bundle\") pod \"glance-db-sync-wzn74\" (UID: \"741e9328-bc42-4fae-b3dd-316f3286fa42\") " pod="openstack/glance-db-sync-wzn74" Dec 05 23:37:20 crc kubenswrapper[4734]: I1205 23:37:20.084207 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtncl\" (UniqueName: \"kubernetes.io/projected/741e9328-bc42-4fae-b3dd-316f3286fa42-kube-api-access-wtncl\") pod \"glance-db-sync-wzn74\" (UID: \"741e9328-bc42-4fae-b3dd-316f3286fa42\") " pod="openstack/glance-db-sync-wzn74" Dec 05 23:37:20 crc kubenswrapper[4734]: I1205 23:37:20.084230 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/741e9328-bc42-4fae-b3dd-316f3286fa42-config-data\") pod \"glance-db-sync-wzn74\" (UID: \"741e9328-bc42-4fae-b3dd-316f3286fa42\") " pod="openstack/glance-db-sync-wzn74" Dec 05 23:37:20 crc kubenswrapper[4734]: I1205 23:37:20.186305 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/741e9328-bc42-4fae-b3dd-316f3286fa42-db-sync-config-data\") pod \"glance-db-sync-wzn74\" (UID: \"741e9328-bc42-4fae-b3dd-316f3286fa42\") " pod="openstack/glance-db-sync-wzn74" Dec 05 23:37:20 crc kubenswrapper[4734]: I1205 23:37:20.186392 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/741e9328-bc42-4fae-b3dd-316f3286fa42-combined-ca-bundle\") pod \"glance-db-sync-wzn74\" (UID: \"741e9328-bc42-4fae-b3dd-316f3286fa42\") " pod="openstack/glance-db-sync-wzn74" Dec 05 23:37:20 crc kubenswrapper[4734]: I1205 23:37:20.186480 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/741e9328-bc42-4fae-b3dd-316f3286fa42-config-data\") pod 
\"glance-db-sync-wzn74\" (UID: \"741e9328-bc42-4fae-b3dd-316f3286fa42\") " pod="openstack/glance-db-sync-wzn74" Dec 05 23:37:20 crc kubenswrapper[4734]: I1205 23:37:20.186507 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtncl\" (UniqueName: \"kubernetes.io/projected/741e9328-bc42-4fae-b3dd-316f3286fa42-kube-api-access-wtncl\") pod \"glance-db-sync-wzn74\" (UID: \"741e9328-bc42-4fae-b3dd-316f3286fa42\") " pod="openstack/glance-db-sync-wzn74" Dec 05 23:37:20 crc kubenswrapper[4734]: I1205 23:37:20.194995 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/741e9328-bc42-4fae-b3dd-316f3286fa42-combined-ca-bundle\") pod \"glance-db-sync-wzn74\" (UID: \"741e9328-bc42-4fae-b3dd-316f3286fa42\") " pod="openstack/glance-db-sync-wzn74" Dec 05 23:37:20 crc kubenswrapper[4734]: I1205 23:37:20.197149 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/741e9328-bc42-4fae-b3dd-316f3286fa42-config-data\") pod \"glance-db-sync-wzn74\" (UID: \"741e9328-bc42-4fae-b3dd-316f3286fa42\") " pod="openstack/glance-db-sync-wzn74" Dec 05 23:37:20 crc kubenswrapper[4734]: I1205 23:37:20.214493 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtncl\" (UniqueName: \"kubernetes.io/projected/741e9328-bc42-4fae-b3dd-316f3286fa42-kube-api-access-wtncl\") pod \"glance-db-sync-wzn74\" (UID: \"741e9328-bc42-4fae-b3dd-316f3286fa42\") " pod="openstack/glance-db-sync-wzn74" Dec 05 23:37:20 crc kubenswrapper[4734]: I1205 23:37:20.216280 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/741e9328-bc42-4fae-b3dd-316f3286fa42-db-sync-config-data\") pod \"glance-db-sync-wzn74\" (UID: \"741e9328-bc42-4fae-b3dd-316f3286fa42\") " pod="openstack/glance-db-sync-wzn74" Dec 05 23:37:20 crc 
kubenswrapper[4734]: I1205 23:37:20.296293 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wzn74" Dec 05 23:37:20 crc kubenswrapper[4734]: I1205 23:37:20.964919 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-587wk" podUID="625f2253-5867-4d61-a436-264a79c0bd94" containerName="ovn-controller" probeResult="failure" output=< Dec 05 23:37:20 crc kubenswrapper[4734]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 05 23:37:20 crc kubenswrapper[4734]: > Dec 05 23:37:21 crc kubenswrapper[4734]: W1205 23:37:21.035873 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod741e9328_bc42_4fae_b3dd_316f3286fa42.slice/crio-84e71b77f1960bcfad7eadc60bc13169af152e4b033dc448bdc6009bfeb04856 WatchSource:0}: Error finding container 84e71b77f1960bcfad7eadc60bc13169af152e4b033dc448bdc6009bfeb04856: Status 404 returned error can't find the container with id 84e71b77f1960bcfad7eadc60bc13169af152e4b033dc448bdc6009bfeb04856 Dec 05 23:37:21 crc kubenswrapper[4734]: I1205 23:37:21.050604 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-wzn74"] Dec 05 23:37:21 crc kubenswrapper[4734]: I1205 23:37:21.668323 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wzn74" event={"ID":"741e9328-bc42-4fae-b3dd-316f3286fa42","Type":"ContainerStarted","Data":"84e71b77f1960bcfad7eadc60bc13169af152e4b033dc448bdc6009bfeb04856"} Dec 05 23:37:23 crc kubenswrapper[4734]: I1205 23:37:23.140297 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fea25d07-8cbc-4875-89e8-1752b0ee2a9e-etc-swift\") pod \"swift-storage-0\" (UID: \"fea25d07-8cbc-4875-89e8-1752b0ee2a9e\") " pod="openstack/swift-storage-0" Dec 05 23:37:23 crc kubenswrapper[4734]: E1205 
23:37:23.140565 4734 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 23:37:23 crc kubenswrapper[4734]: E1205 23:37:23.140584 4734 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 23:37:23 crc kubenswrapper[4734]: E1205 23:37:23.140645 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fea25d07-8cbc-4875-89e8-1752b0ee2a9e-etc-swift podName:fea25d07-8cbc-4875-89e8-1752b0ee2a9e nodeName:}" failed. No retries permitted until 2025-12-05 23:37:39.140625885 +0000 UTC m=+1079.824030161 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fea25d07-8cbc-4875-89e8-1752b0ee2a9e-etc-swift") pod "swift-storage-0" (UID: "fea25d07-8cbc-4875-89e8-1752b0ee2a9e") : configmap "swift-ring-files" not found Dec 05 23:37:25 crc kubenswrapper[4734]: I1205 23:37:25.987593 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-587wk" podUID="625f2253-5867-4d61-a436-264a79c0bd94" containerName="ovn-controller" probeResult="failure" output=< Dec 05 23:37:25 crc kubenswrapper[4734]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 05 23:37:25 crc kubenswrapper[4734]: > Dec 05 23:37:26 crc kubenswrapper[4734]: I1205 23:37:26.007787 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-tpdrq" Dec 05 23:37:26 crc kubenswrapper[4734]: I1205 23:37:26.032290 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-tpdrq" Dec 05 23:37:26 crc kubenswrapper[4734]: I1205 23:37:26.301348 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-587wk-config-tdff5"] Dec 05 23:37:26 crc kubenswrapper[4734]: I1205 23:37:26.303338 4734 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-587wk-config-tdff5" Dec 05 23:37:26 crc kubenswrapper[4734]: I1205 23:37:26.308608 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 05 23:37:26 crc kubenswrapper[4734]: I1205 23:37:26.349333 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-587wk-config-tdff5"] Dec 05 23:37:26 crc kubenswrapper[4734]: I1205 23:37:26.406727 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8162a1e5-af9e-495c-926f-e83571a03720-var-log-ovn\") pod \"ovn-controller-587wk-config-tdff5\" (UID: \"8162a1e5-af9e-495c-926f-e83571a03720\") " pod="openstack/ovn-controller-587wk-config-tdff5" Dec 05 23:37:26 crc kubenswrapper[4734]: I1205 23:37:26.407016 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8162a1e5-af9e-495c-926f-e83571a03720-additional-scripts\") pod \"ovn-controller-587wk-config-tdff5\" (UID: \"8162a1e5-af9e-495c-926f-e83571a03720\") " pod="openstack/ovn-controller-587wk-config-tdff5" Dec 05 23:37:26 crc kubenswrapper[4734]: I1205 23:37:26.407091 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8162a1e5-af9e-495c-926f-e83571a03720-var-run\") pod \"ovn-controller-587wk-config-tdff5\" (UID: \"8162a1e5-af9e-495c-926f-e83571a03720\") " pod="openstack/ovn-controller-587wk-config-tdff5" Dec 05 23:37:26 crc kubenswrapper[4734]: I1205 23:37:26.407175 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8162a1e5-af9e-495c-926f-e83571a03720-var-run-ovn\") pod 
\"ovn-controller-587wk-config-tdff5\" (UID: \"8162a1e5-af9e-495c-926f-e83571a03720\") " pod="openstack/ovn-controller-587wk-config-tdff5" Dec 05 23:37:26 crc kubenswrapper[4734]: I1205 23:37:26.407300 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgv6v\" (UniqueName: \"kubernetes.io/projected/8162a1e5-af9e-495c-926f-e83571a03720-kube-api-access-zgv6v\") pod \"ovn-controller-587wk-config-tdff5\" (UID: \"8162a1e5-af9e-495c-926f-e83571a03720\") " pod="openstack/ovn-controller-587wk-config-tdff5" Dec 05 23:37:26 crc kubenswrapper[4734]: I1205 23:37:26.407401 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8162a1e5-af9e-495c-926f-e83571a03720-scripts\") pod \"ovn-controller-587wk-config-tdff5\" (UID: \"8162a1e5-af9e-495c-926f-e83571a03720\") " pod="openstack/ovn-controller-587wk-config-tdff5" Dec 05 23:37:26 crc kubenswrapper[4734]: I1205 23:37:26.508961 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8162a1e5-af9e-495c-926f-e83571a03720-var-log-ovn\") pod \"ovn-controller-587wk-config-tdff5\" (UID: \"8162a1e5-af9e-495c-926f-e83571a03720\") " pod="openstack/ovn-controller-587wk-config-tdff5" Dec 05 23:37:26 crc kubenswrapper[4734]: I1205 23:37:26.509350 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8162a1e5-af9e-495c-926f-e83571a03720-additional-scripts\") pod \"ovn-controller-587wk-config-tdff5\" (UID: \"8162a1e5-af9e-495c-926f-e83571a03720\") " pod="openstack/ovn-controller-587wk-config-tdff5" Dec 05 23:37:26 crc kubenswrapper[4734]: I1205 23:37:26.509445 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/8162a1e5-af9e-495c-926f-e83571a03720-var-run\") pod \"ovn-controller-587wk-config-tdff5\" (UID: \"8162a1e5-af9e-495c-926f-e83571a03720\") " pod="openstack/ovn-controller-587wk-config-tdff5" Dec 05 23:37:26 crc kubenswrapper[4734]: I1205 23:37:26.509600 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8162a1e5-af9e-495c-926f-e83571a03720-var-run-ovn\") pod \"ovn-controller-587wk-config-tdff5\" (UID: \"8162a1e5-af9e-495c-926f-e83571a03720\") " pod="openstack/ovn-controller-587wk-config-tdff5" Dec 05 23:37:26 crc kubenswrapper[4734]: I1205 23:37:26.509751 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgv6v\" (UniqueName: \"kubernetes.io/projected/8162a1e5-af9e-495c-926f-e83571a03720-kube-api-access-zgv6v\") pod \"ovn-controller-587wk-config-tdff5\" (UID: \"8162a1e5-af9e-495c-926f-e83571a03720\") " pod="openstack/ovn-controller-587wk-config-tdff5" Dec 05 23:37:26 crc kubenswrapper[4734]: I1205 23:37:26.509852 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8162a1e5-af9e-495c-926f-e83571a03720-scripts\") pod \"ovn-controller-587wk-config-tdff5\" (UID: \"8162a1e5-af9e-495c-926f-e83571a03720\") " pod="openstack/ovn-controller-587wk-config-tdff5" Dec 05 23:37:26 crc kubenswrapper[4734]: I1205 23:37:26.511403 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8162a1e5-af9e-495c-926f-e83571a03720-var-run\") pod \"ovn-controller-587wk-config-tdff5\" (UID: \"8162a1e5-af9e-495c-926f-e83571a03720\") " pod="openstack/ovn-controller-587wk-config-tdff5" Dec 05 23:37:26 crc kubenswrapper[4734]: I1205 23:37:26.514087 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/8162a1e5-af9e-495c-926f-e83571a03720-var-log-ovn\") pod \"ovn-controller-587wk-config-tdff5\" (UID: \"8162a1e5-af9e-495c-926f-e83571a03720\") " pod="openstack/ovn-controller-587wk-config-tdff5" Dec 05 23:37:26 crc kubenswrapper[4734]: I1205 23:37:26.515101 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8162a1e5-af9e-495c-926f-e83571a03720-additional-scripts\") pod \"ovn-controller-587wk-config-tdff5\" (UID: \"8162a1e5-af9e-495c-926f-e83571a03720\") " pod="openstack/ovn-controller-587wk-config-tdff5" Dec 05 23:37:26 crc kubenswrapper[4734]: I1205 23:37:26.515162 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8162a1e5-af9e-495c-926f-e83571a03720-var-run-ovn\") pod \"ovn-controller-587wk-config-tdff5\" (UID: \"8162a1e5-af9e-495c-926f-e83571a03720\") " pod="openstack/ovn-controller-587wk-config-tdff5" Dec 05 23:37:26 crc kubenswrapper[4734]: I1205 23:37:26.529926 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8162a1e5-af9e-495c-926f-e83571a03720-scripts\") pod \"ovn-controller-587wk-config-tdff5\" (UID: \"8162a1e5-af9e-495c-926f-e83571a03720\") " pod="openstack/ovn-controller-587wk-config-tdff5" Dec 05 23:37:26 crc kubenswrapper[4734]: I1205 23:37:26.560568 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgv6v\" (UniqueName: \"kubernetes.io/projected/8162a1e5-af9e-495c-926f-e83571a03720-kube-api-access-zgv6v\") pod \"ovn-controller-587wk-config-tdff5\" (UID: \"8162a1e5-af9e-495c-926f-e83571a03720\") " pod="openstack/ovn-controller-587wk-config-tdff5" Dec 05 23:37:26 crc kubenswrapper[4734]: I1205 23:37:26.637292 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-587wk-config-tdff5" Dec 05 23:37:27 crc kubenswrapper[4734]: I1205 23:37:27.305960 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-587wk-config-tdff5"] Dec 05 23:37:27 crc kubenswrapper[4734]: I1205 23:37:27.776925 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-587wk-config-tdff5" event={"ID":"8162a1e5-af9e-495c-926f-e83571a03720","Type":"ContainerStarted","Data":"af8dc54646134df08947f581fbe33154b97a057f76899065313e09bd16c36ef1"} Dec 05 23:37:27 crc kubenswrapper[4734]: I1205 23:37:27.777965 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-587wk-config-tdff5" event={"ID":"8162a1e5-af9e-495c-926f-e83571a03720","Type":"ContainerStarted","Data":"a09a65d01869685eafb0b78e725400e80e85c43f95a4582d9bdded4cca40a79a"} Dec 05 23:37:27 crc kubenswrapper[4734]: I1205 23:37:27.781101 4734 generic.go:334] "Generic (PLEG): container finished" podID="a1e03821-b44b-4ce9-8fb9-6831bf8b087f" containerID="53307a805cb48b051f9646b7dd535a0c872ce431e6f5c31679621c480f2ed5a4" exitCode=0 Dec 05 23:37:27 crc kubenswrapper[4734]: I1205 23:37:27.781158 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qdl57" event={"ID":"a1e03821-b44b-4ce9-8fb9-6831bf8b087f","Type":"ContainerDied","Data":"53307a805cb48b051f9646b7dd535a0c872ce431e6f5c31679621c480f2ed5a4"} Dec 05 23:37:27 crc kubenswrapper[4734]: I1205 23:37:27.807730 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-587wk-config-tdff5" podStartSLOduration=1.807702758 podStartE2EDuration="1.807702758s" podCreationTimestamp="2025-12-05 23:37:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:37:27.800959254 +0000 UTC m=+1068.484363540" watchObservedRunningTime="2025-12-05 23:37:27.807702758 
+0000 UTC m=+1068.491107034" Dec 05 23:37:28 crc kubenswrapper[4734]: I1205 23:37:28.797860 4734 generic.go:334] "Generic (PLEG): container finished" podID="8162a1e5-af9e-495c-926f-e83571a03720" containerID="af8dc54646134df08947f581fbe33154b97a057f76899065313e09bd16c36ef1" exitCode=0 Dec 05 23:37:28 crc kubenswrapper[4734]: I1205 23:37:28.798387 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-587wk-config-tdff5" event={"ID":"8162a1e5-af9e-495c-926f-e83571a03720","Type":"ContainerDied","Data":"af8dc54646134df08947f581fbe33154b97a057f76899065313e09bd16c36ef1"} Dec 05 23:37:31 crc kubenswrapper[4734]: I1205 23:37:31.215718 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 05 23:37:31 crc kubenswrapper[4734]: I1205 23:37:31.346386 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-587wk" Dec 05 23:37:31 crc kubenswrapper[4734]: I1205 23:37:31.463770 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:37:31 crc kubenswrapper[4734]: I1205 23:37:31.794994 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-s5w2r"] Dec 05 23:37:31 crc kubenswrapper[4734]: I1205 23:37:31.796565 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-s5w2r" Dec 05 23:37:31 crc kubenswrapper[4734]: I1205 23:37:31.814739 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-a7d4-account-create-update-6mgtp"] Dec 05 23:37:31 crc kubenswrapper[4734]: I1205 23:37:31.816327 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-a7d4-account-create-update-6mgtp" Dec 05 23:37:31 crc kubenswrapper[4734]: I1205 23:37:31.825435 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-s5w2r"] Dec 05 23:37:31 crc kubenswrapper[4734]: I1205 23:37:31.828656 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 05 23:37:31 crc kubenswrapper[4734]: I1205 23:37:31.865815 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a7d4-account-create-update-6mgtp"] Dec 05 23:37:31 crc kubenswrapper[4734]: I1205 23:37:31.921673 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-bjk8x"] Dec 05 23:37:31 crc kubenswrapper[4734]: I1205 23:37:31.923601 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-bjk8x" Dec 05 23:37:31 crc kubenswrapper[4734]: I1205 23:37:31.930755 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-e6fc-account-create-update-whgg9"] Dec 05 23:37:31 crc kubenswrapper[4734]: I1205 23:37:31.933158 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj62p\" (UniqueName: \"kubernetes.io/projected/9c2002ce-57e4-45bd-9110-9f7ebd50d0e7-kube-api-access-sj62p\") pod \"barbican-db-create-s5w2r\" (UID: \"9c2002ce-57e4-45bd-9110-9f7ebd50d0e7\") " pod="openstack/barbican-db-create-s5w2r" Dec 05 23:37:31 crc kubenswrapper[4734]: I1205 23:37:31.933243 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c2002ce-57e4-45bd-9110-9f7ebd50d0e7-operator-scripts\") pod \"barbican-db-create-s5w2r\" (UID: \"9c2002ce-57e4-45bd-9110-9f7ebd50d0e7\") " pod="openstack/barbican-db-create-s5w2r" Dec 05 23:37:31 crc kubenswrapper[4734]: I1205 23:37:31.933292 4734 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe0296c8-8413-4c49-ae9c-e68e1dcbdb03-operator-scripts\") pod \"barbican-a7d4-account-create-update-6mgtp\" (UID: \"fe0296c8-8413-4c49-ae9c-e68e1dcbdb03\") " pod="openstack/barbican-a7d4-account-create-update-6mgtp" Dec 05 23:37:31 crc kubenswrapper[4734]: I1205 23:37:31.933331 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx45z\" (UniqueName: \"kubernetes.io/projected/fe0296c8-8413-4c49-ae9c-e68e1dcbdb03-kube-api-access-mx45z\") pod \"barbican-a7d4-account-create-update-6mgtp\" (UID: \"fe0296c8-8413-4c49-ae9c-e68e1dcbdb03\") " pod="openstack/barbican-a7d4-account-create-update-6mgtp" Dec 05 23:37:31 crc kubenswrapper[4734]: I1205 23:37:31.933567 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e6fc-account-create-update-whgg9" Dec 05 23:37:31 crc kubenswrapper[4734]: I1205 23:37:31.950695 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-bjk8x"] Dec 05 23:37:31 crc kubenswrapper[4734]: I1205 23:37:31.954433 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 05 23:37:31 crc kubenswrapper[4734]: I1205 23:37:31.966968 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e6fc-account-create-update-whgg9"] Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.034708 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwcq5\" (UniqueName: \"kubernetes.io/projected/8e04b323-6b27-4e58-9688-a7bc57317e6e-kube-api-access-pwcq5\") pod \"cinder-e6fc-account-create-update-whgg9\" (UID: \"8e04b323-6b27-4e58-9688-a7bc57317e6e\") " pod="openstack/cinder-e6fc-account-create-update-whgg9" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.034769 4734 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj62p\" (UniqueName: \"kubernetes.io/projected/9c2002ce-57e4-45bd-9110-9f7ebd50d0e7-kube-api-access-sj62p\") pod \"barbican-db-create-s5w2r\" (UID: \"9c2002ce-57e4-45bd-9110-9f7ebd50d0e7\") " pod="openstack/barbican-db-create-s5w2r" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.034809 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c2002ce-57e4-45bd-9110-9f7ebd50d0e7-operator-scripts\") pod \"barbican-db-create-s5w2r\" (UID: \"9c2002ce-57e4-45bd-9110-9f7ebd50d0e7\") " pod="openstack/barbican-db-create-s5w2r" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.034845 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f3e2e20-cc04-41cd-94df-e0748036144a-operator-scripts\") pod \"cinder-db-create-bjk8x\" (UID: \"1f3e2e20-cc04-41cd-94df-e0748036144a\") " pod="openstack/cinder-db-create-bjk8x" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.034874 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe0296c8-8413-4c49-ae9c-e68e1dcbdb03-operator-scripts\") pod \"barbican-a7d4-account-create-update-6mgtp\" (UID: \"fe0296c8-8413-4c49-ae9c-e68e1dcbdb03\") " pod="openstack/barbican-a7d4-account-create-update-6mgtp" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.034913 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx45z\" (UniqueName: \"kubernetes.io/projected/fe0296c8-8413-4c49-ae9c-e68e1dcbdb03-kube-api-access-mx45z\") pod \"barbican-a7d4-account-create-update-6mgtp\" (UID: \"fe0296c8-8413-4c49-ae9c-e68e1dcbdb03\") " pod="openstack/barbican-a7d4-account-create-update-6mgtp" Dec 05 23:37:32 crc 
kubenswrapper[4734]: I1205 23:37:32.034944 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5qx2\" (UniqueName: \"kubernetes.io/projected/1f3e2e20-cc04-41cd-94df-e0748036144a-kube-api-access-r5qx2\") pod \"cinder-db-create-bjk8x\" (UID: \"1f3e2e20-cc04-41cd-94df-e0748036144a\") " pod="openstack/cinder-db-create-bjk8x" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.035005 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e04b323-6b27-4e58-9688-a7bc57317e6e-operator-scripts\") pod \"cinder-e6fc-account-create-update-whgg9\" (UID: \"8e04b323-6b27-4e58-9688-a7bc57317e6e\") " pod="openstack/cinder-e6fc-account-create-update-whgg9" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.036206 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c2002ce-57e4-45bd-9110-9f7ebd50d0e7-operator-scripts\") pod \"barbican-db-create-s5w2r\" (UID: \"9c2002ce-57e4-45bd-9110-9f7ebd50d0e7\") " pod="openstack/barbican-db-create-s5w2r" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.102754 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe0296c8-8413-4c49-ae9c-e68e1dcbdb03-operator-scripts\") pod \"barbican-a7d4-account-create-update-6mgtp\" (UID: \"fe0296c8-8413-4c49-ae9c-e68e1dcbdb03\") " pod="openstack/barbican-a7d4-account-create-update-6mgtp" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.136348 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e04b323-6b27-4e58-9688-a7bc57317e6e-operator-scripts\") pod \"cinder-e6fc-account-create-update-whgg9\" (UID: \"8e04b323-6b27-4e58-9688-a7bc57317e6e\") " 
pod="openstack/cinder-e6fc-account-create-update-whgg9" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.136865 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwcq5\" (UniqueName: \"kubernetes.io/projected/8e04b323-6b27-4e58-9688-a7bc57317e6e-kube-api-access-pwcq5\") pod \"cinder-e6fc-account-create-update-whgg9\" (UID: \"8e04b323-6b27-4e58-9688-a7bc57317e6e\") " pod="openstack/cinder-e6fc-account-create-update-whgg9" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.136930 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f3e2e20-cc04-41cd-94df-e0748036144a-operator-scripts\") pod \"cinder-db-create-bjk8x\" (UID: \"1f3e2e20-cc04-41cd-94df-e0748036144a\") " pod="openstack/cinder-db-create-bjk8x" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.137127 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5qx2\" (UniqueName: \"kubernetes.io/projected/1f3e2e20-cc04-41cd-94df-e0748036144a-kube-api-access-r5qx2\") pod \"cinder-db-create-bjk8x\" (UID: \"1f3e2e20-cc04-41cd-94df-e0748036144a\") " pod="openstack/cinder-db-create-bjk8x" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.137287 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e04b323-6b27-4e58-9688-a7bc57317e6e-operator-scripts\") pod \"cinder-e6fc-account-create-update-whgg9\" (UID: \"8e04b323-6b27-4e58-9688-a7bc57317e6e\") " pod="openstack/cinder-e6fc-account-create-update-whgg9" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.137894 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f3e2e20-cc04-41cd-94df-e0748036144a-operator-scripts\") pod \"cinder-db-create-bjk8x\" (UID: \"1f3e2e20-cc04-41cd-94df-e0748036144a\") " 
pod="openstack/cinder-db-create-bjk8x" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.186500 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx45z\" (UniqueName: \"kubernetes.io/projected/fe0296c8-8413-4c49-ae9c-e68e1dcbdb03-kube-api-access-mx45z\") pod \"barbican-a7d4-account-create-update-6mgtp\" (UID: \"fe0296c8-8413-4c49-ae9c-e68e1dcbdb03\") " pod="openstack/barbican-a7d4-account-create-update-6mgtp" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.203262 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj62p\" (UniqueName: \"kubernetes.io/projected/9c2002ce-57e4-45bd-9110-9f7ebd50d0e7-kube-api-access-sj62p\") pod \"barbican-db-create-s5w2r\" (UID: \"9c2002ce-57e4-45bd-9110-9f7ebd50d0e7\") " pod="openstack/barbican-db-create-s5w2r" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.203718 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5qx2\" (UniqueName: \"kubernetes.io/projected/1f3e2e20-cc04-41cd-94df-e0748036144a-kube-api-access-r5qx2\") pod \"cinder-db-create-bjk8x\" (UID: \"1f3e2e20-cc04-41cd-94df-e0748036144a\") " pod="openstack/cinder-db-create-bjk8x" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.209431 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwcq5\" (UniqueName: \"kubernetes.io/projected/8e04b323-6b27-4e58-9688-a7bc57317e6e-kube-api-access-pwcq5\") pod \"cinder-e6fc-account-create-update-whgg9\" (UID: \"8e04b323-6b27-4e58-9688-a7bc57317e6e\") " pod="openstack/cinder-e6fc-account-create-update-whgg9" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.242264 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-khd2f"] Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.244291 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-khd2f" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.249351 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.250927 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.251131 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.251453 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qf7c2" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.282890 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e6fc-account-create-update-whgg9" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.283760 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-bjk8x" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.358988 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/019f8649-b37e-4970-a742-33afa217a2b4-combined-ca-bundle\") pod \"keystone-db-sync-khd2f\" (UID: \"019f8649-b37e-4970-a742-33afa217a2b4\") " pod="openstack/keystone-db-sync-khd2f" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.359250 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/019f8649-b37e-4970-a742-33afa217a2b4-config-data\") pod \"keystone-db-sync-khd2f\" (UID: \"019f8649-b37e-4970-a742-33afa217a2b4\") " pod="openstack/keystone-db-sync-khd2f" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.359319 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc4gl\" (UniqueName: \"kubernetes.io/projected/019f8649-b37e-4970-a742-33afa217a2b4-kube-api-access-kc4gl\") pod \"keystone-db-sync-khd2f\" (UID: \"019f8649-b37e-4970-a742-33afa217a2b4\") " pod="openstack/keystone-db-sync-khd2f" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.437537 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-s5w2r" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.456477 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-a7d4-account-create-update-6mgtp" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.463781 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/019f8649-b37e-4970-a742-33afa217a2b4-config-data\") pod \"keystone-db-sync-khd2f\" (UID: \"019f8649-b37e-4970-a742-33afa217a2b4\") " pod="openstack/keystone-db-sync-khd2f" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.463903 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc4gl\" (UniqueName: \"kubernetes.io/projected/019f8649-b37e-4970-a742-33afa217a2b4-kube-api-access-kc4gl\") pod \"keystone-db-sync-khd2f\" (UID: \"019f8649-b37e-4970-a742-33afa217a2b4\") " pod="openstack/keystone-db-sync-khd2f" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.463995 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/019f8649-b37e-4970-a742-33afa217a2b4-combined-ca-bundle\") pod \"keystone-db-sync-khd2f\" (UID: \"019f8649-b37e-4970-a742-33afa217a2b4\") " pod="openstack/keystone-db-sync-khd2f" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.470329 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/019f8649-b37e-4970-a742-33afa217a2b4-combined-ca-bundle\") pod \"keystone-db-sync-khd2f\" (UID: \"019f8649-b37e-4970-a742-33afa217a2b4\") " pod="openstack/keystone-db-sync-khd2f" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.472581 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/019f8649-b37e-4970-a742-33afa217a2b4-config-data\") pod \"keystone-db-sync-khd2f\" (UID: \"019f8649-b37e-4970-a742-33afa217a2b4\") " pod="openstack/keystone-db-sync-khd2f" Dec 05 23:37:32 crc kubenswrapper[4734]: 
I1205 23:37:32.508887 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-khd2f"] Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.520622 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc4gl\" (UniqueName: \"kubernetes.io/projected/019f8649-b37e-4970-a742-33afa217a2b4-kube-api-access-kc4gl\") pod \"keystone-db-sync-khd2f\" (UID: \"019f8649-b37e-4970-a742-33afa217a2b4\") " pod="openstack/keystone-db-sync-khd2f" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.537500 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-6zn8b"] Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.539761 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6zn8b" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.553771 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-6zn8b"] Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.597625 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-khd2f" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.641721 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f18c-account-create-update-qp7p8"] Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.643245 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f18c-account-create-update-qp7p8" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.646028 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.659083 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f18c-account-create-update-qp7p8"] Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.667984 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc8vc\" (UniqueName: \"kubernetes.io/projected/6f5039b6-0875-4dd9-a5c4-9e6849dbb221-kube-api-access-pc8vc\") pod \"neutron-db-create-6zn8b\" (UID: \"6f5039b6-0875-4dd9-a5c4-9e6849dbb221\") " pod="openstack/neutron-db-create-6zn8b" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.668069 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f5039b6-0875-4dd9-a5c4-9e6849dbb221-operator-scripts\") pod \"neutron-db-create-6zn8b\" (UID: \"6f5039b6-0875-4dd9-a5c4-9e6849dbb221\") " pod="openstack/neutron-db-create-6zn8b" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.770057 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26927571-4070-4160-9e19-82f06d7d2a06-operator-scripts\") pod \"neutron-f18c-account-create-update-qp7p8\" (UID: \"26927571-4070-4160-9e19-82f06d7d2a06\") " pod="openstack/neutron-f18c-account-create-update-qp7p8" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.770163 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc8vc\" (UniqueName: \"kubernetes.io/projected/6f5039b6-0875-4dd9-a5c4-9e6849dbb221-kube-api-access-pc8vc\") pod \"neutron-db-create-6zn8b\" (UID: 
\"6f5039b6-0875-4dd9-a5c4-9e6849dbb221\") " pod="openstack/neutron-db-create-6zn8b" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.770188 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f5039b6-0875-4dd9-a5c4-9e6849dbb221-operator-scripts\") pod \"neutron-db-create-6zn8b\" (UID: \"6f5039b6-0875-4dd9-a5c4-9e6849dbb221\") " pod="openstack/neutron-db-create-6zn8b" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.770230 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm6fn\" (UniqueName: \"kubernetes.io/projected/26927571-4070-4160-9e19-82f06d7d2a06-kube-api-access-nm6fn\") pod \"neutron-f18c-account-create-update-qp7p8\" (UID: \"26927571-4070-4160-9e19-82f06d7d2a06\") " pod="openstack/neutron-f18c-account-create-update-qp7p8" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.771102 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f5039b6-0875-4dd9-a5c4-9e6849dbb221-operator-scripts\") pod \"neutron-db-create-6zn8b\" (UID: \"6f5039b6-0875-4dd9-a5c4-9e6849dbb221\") " pod="openstack/neutron-db-create-6zn8b" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.798002 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc8vc\" (UniqueName: \"kubernetes.io/projected/6f5039b6-0875-4dd9-a5c4-9e6849dbb221-kube-api-access-pc8vc\") pod \"neutron-db-create-6zn8b\" (UID: \"6f5039b6-0875-4dd9-a5c4-9e6849dbb221\") " pod="openstack/neutron-db-create-6zn8b" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.897440 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-6zn8b" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.898292 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26927571-4070-4160-9e19-82f06d7d2a06-operator-scripts\") pod \"neutron-f18c-account-create-update-qp7p8\" (UID: \"26927571-4070-4160-9e19-82f06d7d2a06\") " pod="openstack/neutron-f18c-account-create-update-qp7p8" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.898456 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm6fn\" (UniqueName: \"kubernetes.io/projected/26927571-4070-4160-9e19-82f06d7d2a06-kube-api-access-nm6fn\") pod \"neutron-f18c-account-create-update-qp7p8\" (UID: \"26927571-4070-4160-9e19-82f06d7d2a06\") " pod="openstack/neutron-f18c-account-create-update-qp7p8" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.899855 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26927571-4070-4160-9e19-82f06d7d2a06-operator-scripts\") pod \"neutron-f18c-account-create-update-qp7p8\" (UID: \"26927571-4070-4160-9e19-82f06d7d2a06\") " pod="openstack/neutron-f18c-account-create-update-qp7p8" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.924345 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm6fn\" (UniqueName: \"kubernetes.io/projected/26927571-4070-4160-9e19-82f06d7d2a06-kube-api-access-nm6fn\") pod \"neutron-f18c-account-create-update-qp7p8\" (UID: \"26927571-4070-4160-9e19-82f06d7d2a06\") " pod="openstack/neutron-f18c-account-create-update-qp7p8" Dec 05 23:37:32 crc kubenswrapper[4734]: I1205 23:37:32.966197 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f18c-account-create-update-qp7p8" Dec 05 23:37:39 crc kubenswrapper[4734]: I1205 23:37:39.178721 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fea25d07-8cbc-4875-89e8-1752b0ee2a9e-etc-swift\") pod \"swift-storage-0\" (UID: \"fea25d07-8cbc-4875-89e8-1752b0ee2a9e\") " pod="openstack/swift-storage-0" Dec 05 23:37:39 crc kubenswrapper[4734]: I1205 23:37:39.190012 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fea25d07-8cbc-4875-89e8-1752b0ee2a9e-etc-swift\") pod \"swift-storage-0\" (UID: \"fea25d07-8cbc-4875-89e8-1752b0ee2a9e\") " pod="openstack/swift-storage-0" Dec 05 23:37:39 crc kubenswrapper[4734]: I1205 23:37:39.231636 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 05 23:37:41 crc kubenswrapper[4734]: E1205 23:37:41.926318 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Dec 05 23:37:41 crc kubenswrapper[4734]: E1205 23:37:41.928220 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wtncl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-wzn74_openstack(741e9328-bc42-4fae-b3dd-316f3286fa42): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Dec 05 23:37:41 crc kubenswrapper[4734]: E1205 23:37:41.929409 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-wzn74" podUID="741e9328-bc42-4fae-b3dd-316f3286fa42" Dec 05 23:37:41 crc kubenswrapper[4734]: I1205 23:37:41.940915 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-qdl57" Dec 05 23:37:41 crc kubenswrapper[4734]: I1205 23:37:41.957582 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-587wk-config-tdff5" Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.049719 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-qdl57" Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.050358 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qdl57" event={"ID":"a1e03821-b44b-4ce9-8fb9-6831bf8b087f","Type":"ContainerDied","Data":"64b309db10c29093707891d0d3136a34c9af13cd98b38b040ab4b50d69e25718"} Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.050405 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64b309db10c29093707891d0d3136a34c9af13cd98b38b040ab4b50d69e25718" Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.053770 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-587wk-config-tdff5" Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.053781 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-587wk-config-tdff5" event={"ID":"8162a1e5-af9e-495c-926f-e83571a03720","Type":"ContainerDied","Data":"a09a65d01869685eafb0b78e725400e80e85c43f95a4582d9bdded4cca40a79a"} Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.053862 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a09a65d01869685eafb0b78e725400e80e85c43f95a4582d9bdded4cca40a79a" Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.067173 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgv6v\" (UniqueName: \"kubernetes.io/projected/8162a1e5-af9e-495c-926f-e83571a03720-kube-api-access-zgv6v\") pod \"8162a1e5-af9e-495c-926f-e83571a03720\" (UID: \"8162a1e5-af9e-495c-926f-e83571a03720\") " Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.067467 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-scripts\") pod \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\" (UID: \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\") " Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.067539 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-ring-data-devices\") pod \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\" (UID: \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\") " Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.067571 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8162a1e5-af9e-495c-926f-e83571a03720-var-log-ovn\") pod \"8162a1e5-af9e-495c-926f-e83571a03720\" (UID: 
\"8162a1e5-af9e-495c-926f-e83571a03720\") " Dec 05 23:37:42 crc kubenswrapper[4734]: E1205 23:37:42.071250 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-wzn74" podUID="741e9328-bc42-4fae-b3dd-316f3286fa42" Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.074355 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8162a1e5-af9e-495c-926f-e83571a03720-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "8162a1e5-af9e-495c-926f-e83571a03720" (UID: "8162a1e5-af9e-495c-926f-e83571a03720"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.075355 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a1e03821-b44b-4ce9-8fb9-6831bf8b087f" (UID: "a1e03821-b44b-4ce9-8fb9-6831bf8b087f"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.081707 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a1e03821-b44b-4ce9-8fb9-6831bf8b087f" (UID: "a1e03821-b44b-4ce9-8fb9-6831bf8b087f"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.084344 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8162a1e5-af9e-495c-926f-e83571a03720-kube-api-access-zgv6v" (OuterVolumeSpecName: "kube-api-access-zgv6v") pod "8162a1e5-af9e-495c-926f-e83571a03720" (UID: "8162a1e5-af9e-495c-926f-e83571a03720"). InnerVolumeSpecName "kube-api-access-zgv6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.092739 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-etc-swift\") pod \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\" (UID: \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\") " Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.092866 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8162a1e5-af9e-495c-926f-e83571a03720-scripts\") pod \"8162a1e5-af9e-495c-926f-e83571a03720\" (UID: \"8162a1e5-af9e-495c-926f-e83571a03720\") " Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.092921 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-swiftconf\") pod \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\" (UID: \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\") " Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.093014 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8162a1e5-af9e-495c-926f-e83571a03720-additional-scripts\") pod \"8162a1e5-af9e-495c-926f-e83571a03720\" (UID: \"8162a1e5-af9e-495c-926f-e83571a03720\") " Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.093056 4734 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-combined-ca-bundle\") pod \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\" (UID: \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\") " Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.093093 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ks6x\" (UniqueName: \"kubernetes.io/projected/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-kube-api-access-5ks6x\") pod \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\" (UID: \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\") " Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.093154 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8162a1e5-af9e-495c-926f-e83571a03720-var-run-ovn\") pod \"8162a1e5-af9e-495c-926f-e83571a03720\" (UID: \"8162a1e5-af9e-495c-926f-e83571a03720\") " Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.093182 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8162a1e5-af9e-495c-926f-e83571a03720-var-run\") pod \"8162a1e5-af9e-495c-926f-e83571a03720\" (UID: \"8162a1e5-af9e-495c-926f-e83571a03720\") " Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.093211 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-dispersionconf\") pod \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\" (UID: \"a1e03821-b44b-4ce9-8fb9-6831bf8b087f\") " Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.095981 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8162a1e5-af9e-495c-926f-e83571a03720-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "8162a1e5-af9e-495c-926f-e83571a03720" (UID: 
"8162a1e5-af9e-495c-926f-e83571a03720"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.096120 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8162a1e5-af9e-495c-926f-e83571a03720-var-run" (OuterVolumeSpecName: "var-run") pod "8162a1e5-af9e-495c-926f-e83571a03720" (UID: "8162a1e5-af9e-495c-926f-e83571a03720"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.097205 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8162a1e5-af9e-495c-926f-e83571a03720-scripts" (OuterVolumeSpecName: "scripts") pod "8162a1e5-af9e-495c-926f-e83571a03720" (UID: "8162a1e5-af9e-495c-926f-e83571a03720"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.097660 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8162a1e5-af9e-495c-926f-e83571a03720-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "8162a1e5-af9e-495c-926f-e83571a03720" (UID: "8162a1e5-af9e-495c-926f-e83571a03720"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.100951 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgv6v\" (UniqueName: \"kubernetes.io/projected/8162a1e5-af9e-495c-926f-e83571a03720-kube-api-access-zgv6v\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.100994 4734 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.101006 4734 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8162a1e5-af9e-495c-926f-e83571a03720-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.101018 4734 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.104190 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-kube-api-access-5ks6x" (OuterVolumeSpecName: "kube-api-access-5ks6x") pod "a1e03821-b44b-4ce9-8fb9-6831bf8b087f" (UID: "a1e03821-b44b-4ce9-8fb9-6831bf8b087f"). InnerVolumeSpecName "kube-api-access-5ks6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.107692 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a1e03821-b44b-4ce9-8fb9-6831bf8b087f" (UID: "a1e03821-b44b-4ce9-8fb9-6831bf8b087f"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.159980 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-scripts" (OuterVolumeSpecName: "scripts") pod "a1e03821-b44b-4ce9-8fb9-6831bf8b087f" (UID: "a1e03821-b44b-4ce9-8fb9-6831bf8b087f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.184692 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1e03821-b44b-4ce9-8fb9-6831bf8b087f" (UID: "a1e03821-b44b-4ce9-8fb9-6831bf8b087f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.202724 4734 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8162a1e5-af9e-495c-926f-e83571a03720-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.202762 4734 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8162a1e5-af9e-495c-926f-e83571a03720-var-run\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.202774 4734 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.202786 4734 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:42 crc kubenswrapper[4734]: 
I1205 23:37:42.202798 4734 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8162a1e5-af9e-495c-926f-e83571a03720-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.202809 4734 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8162a1e5-af9e-495c-926f-e83571a03720-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.202821 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.202832 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ks6x\" (UniqueName: \"kubernetes.io/projected/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-kube-api-access-5ks6x\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.257947 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a1e03821-b44b-4ce9-8fb9-6831bf8b087f" (UID: "a1e03821-b44b-4ce9-8fb9-6831bf8b087f"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.362871 4734 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a1e03821-b44b-4ce9-8fb9-6831bf8b087f-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.891804 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-khd2f"] Dec 05 23:37:42 crc kubenswrapper[4734]: W1205 23:37:42.908266 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod019f8649_b37e_4970_a742_33afa217a2b4.slice/crio-cb00d76f20f9c41bd3a668a07795293482c273004c765b77f84d9d0bfe4cd9db WatchSource:0}: Error finding container cb00d76f20f9c41bd3a668a07795293482c273004c765b77f84d9d0bfe4cd9db: Status 404 returned error can't find the container with id cb00d76f20f9c41bd3a668a07795293482c273004c765b77f84d9d0bfe4cd9db Dec 05 23:37:42 crc kubenswrapper[4734]: W1205 23:37:42.910506 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e04b323_6b27_4e58_9688_a7bc57317e6e.slice/crio-1e7ef7a970e395010cdf183b6fcef490f015ee501ef1c4be92d7a443f0061af4 WatchSource:0}: Error finding container 1e7ef7a970e395010cdf183b6fcef490f015ee501ef1c4be92d7a443f0061af4: Status 404 returned error can't find the container with id 1e7ef7a970e395010cdf183b6fcef490f015ee501ef1c4be92d7a443f0061af4 Dec 05 23:37:42 crc kubenswrapper[4734]: W1205 23:37:42.942990 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe0296c8_8413_4c49_ae9c_e68e1dcbdb03.slice/crio-ec6d095a4be2cf01b5138f866f3001bf3e354e13b406ee6105b63589d14cfacf WatchSource:0}: Error finding container ec6d095a4be2cf01b5138f866f3001bf3e354e13b406ee6105b63589d14cfacf: Status 404 returned error can't 
find the container with id ec6d095a4be2cf01b5138f866f3001bf3e354e13b406ee6105b63589d14cfacf Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.951179 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.955491 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e6fc-account-create-update-whgg9"] Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.963394 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a7d4-account-create-update-6mgtp"] Dec 05 23:37:42 crc kubenswrapper[4734]: I1205 23:37:42.967068 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 05 23:37:43 crc kubenswrapper[4734]: I1205 23:37:43.083284 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-bjk8x"] Dec 05 23:37:43 crc kubenswrapper[4734]: I1205 23:37:43.094456 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 05 23:37:43 crc kubenswrapper[4734]: W1205 23:37:43.096384 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f3e2e20_cc04_41cd_94df_e0748036144a.slice/crio-c547ac6e5fb0c4ea2415a88e4ea43485bd86ef49504fb84ae27ad0d34fb553a0 WatchSource:0}: Error finding container c547ac6e5fb0c4ea2415a88e4ea43485bd86ef49504fb84ae27ad0d34fb553a0: Status 404 returned error can't find the container with id c547ac6e5fb0c4ea2415a88e4ea43485bd86ef49504fb84ae27ad0d34fb553a0 Dec 05 23:37:43 crc kubenswrapper[4734]: I1205 23:37:43.117051 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-khd2f" event={"ID":"019f8649-b37e-4970-a742-33afa217a2b4","Type":"ContainerStarted","Data":"cb00d76f20f9c41bd3a668a07795293482c273004c765b77f84d9d0bfe4cd9db"} Dec 05 23:37:43 crc kubenswrapper[4734]: I1205 23:37:43.131581 4734 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a7d4-account-create-update-6mgtp" event={"ID":"fe0296c8-8413-4c49-ae9c-e68e1dcbdb03","Type":"ContainerStarted","Data":"ec6d095a4be2cf01b5138f866f3001bf3e354e13b406ee6105b63589d14cfacf"} Dec 05 23:37:43 crc kubenswrapper[4734]: I1205 23:37:43.142107 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-6zn8b"] Dec 05 23:37:43 crc kubenswrapper[4734]: I1205 23:37:43.148590 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f18c-account-create-update-qp7p8"] Dec 05 23:37:43 crc kubenswrapper[4734]: I1205 23:37:43.159299 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e6fc-account-create-update-whgg9" event={"ID":"8e04b323-6b27-4e58-9688-a7bc57317e6e","Type":"ContainerStarted","Data":"1e7ef7a970e395010cdf183b6fcef490f015ee501ef1c4be92d7a443f0061af4"} Dec 05 23:37:43 crc kubenswrapper[4734]: W1205 23:37:43.163464 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f5039b6_0875_4dd9_a5c4_9e6849dbb221.slice/crio-bf4070ad550c15a0dd5494c58dbf5235b969ff019c891961e18562fa3e33ca18 WatchSource:0}: Error finding container bf4070ad550c15a0dd5494c58dbf5235b969ff019c891961e18562fa3e33ca18: Status 404 returned error can't find the container with id bf4070ad550c15a0dd5494c58dbf5235b969ff019c891961e18562fa3e33ca18 Dec 05 23:37:43 crc kubenswrapper[4734]: I1205 23:37:43.165513 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-587wk-config-tdff5"] Dec 05 23:37:43 crc kubenswrapper[4734]: I1205 23:37:43.180215 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-s5w2r"] Dec 05 23:37:43 crc kubenswrapper[4734]: I1205 23:37:43.189246 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-587wk-config-tdff5"] Dec 05 23:37:43 crc 
kubenswrapper[4734]: I1205 23:37:43.189824 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 05 23:37:43 crc kubenswrapper[4734]: I1205 23:37:43.625037 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8162a1e5-af9e-495c-926f-e83571a03720" path="/var/lib/kubelet/pods/8162a1e5-af9e-495c-926f-e83571a03720/volumes" Dec 05 23:37:44 crc kubenswrapper[4734]: I1205 23:37:44.184216 4734 generic.go:334] "Generic (PLEG): container finished" podID="6f5039b6-0875-4dd9-a5c4-9e6849dbb221" containerID="b5a3c97982fe4691690f8cfa50f1d8dc15b8c1d0aea8f1c1c43c4490cf284e5b" exitCode=0 Dec 05 23:37:44 crc kubenswrapper[4734]: I1205 23:37:44.184291 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6zn8b" event={"ID":"6f5039b6-0875-4dd9-a5c4-9e6849dbb221","Type":"ContainerDied","Data":"b5a3c97982fe4691690f8cfa50f1d8dc15b8c1d0aea8f1c1c43c4490cf284e5b"} Dec 05 23:37:44 crc kubenswrapper[4734]: I1205 23:37:44.184359 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6zn8b" event={"ID":"6f5039b6-0875-4dd9-a5c4-9e6849dbb221","Type":"ContainerStarted","Data":"bf4070ad550c15a0dd5494c58dbf5235b969ff019c891961e18562fa3e33ca18"} Dec 05 23:37:44 crc kubenswrapper[4734]: I1205 23:37:44.187117 4734 generic.go:334] "Generic (PLEG): container finished" podID="1f3e2e20-cc04-41cd-94df-e0748036144a" containerID="25ea3dd465d7374c67e296a8dbcfcb478d415f8e89230b9c95e81b2f9a1371ae" exitCode=0 Dec 05 23:37:44 crc kubenswrapper[4734]: I1205 23:37:44.187203 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bjk8x" event={"ID":"1f3e2e20-cc04-41cd-94df-e0748036144a","Type":"ContainerDied","Data":"25ea3dd465d7374c67e296a8dbcfcb478d415f8e89230b9c95e81b2f9a1371ae"} Dec 05 23:37:44 crc kubenswrapper[4734]: I1205 23:37:44.187222 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bjk8x" 
event={"ID":"1f3e2e20-cc04-41cd-94df-e0748036144a","Type":"ContainerStarted","Data":"c547ac6e5fb0c4ea2415a88e4ea43485bd86ef49504fb84ae27ad0d34fb553a0"} Dec 05 23:37:44 crc kubenswrapper[4734]: I1205 23:37:44.189319 4734 generic.go:334] "Generic (PLEG): container finished" podID="26927571-4070-4160-9e19-82f06d7d2a06" containerID="b97e73187b85ceeab5cfd09e4d53d4dbb45cabe0bfe6efb107a76fbc9b10d289" exitCode=0 Dec 05 23:37:44 crc kubenswrapper[4734]: I1205 23:37:44.189370 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f18c-account-create-update-qp7p8" event={"ID":"26927571-4070-4160-9e19-82f06d7d2a06","Type":"ContainerDied","Data":"b97e73187b85ceeab5cfd09e4d53d4dbb45cabe0bfe6efb107a76fbc9b10d289"} Dec 05 23:37:44 crc kubenswrapper[4734]: I1205 23:37:44.189386 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f18c-account-create-update-qp7p8" event={"ID":"26927571-4070-4160-9e19-82f06d7d2a06","Type":"ContainerStarted","Data":"af57c1d43c2f4f8519f8e34626f8766440f9cc8d09f4ed4e2e9c923bc493b5c1"} Dec 05 23:37:44 crc kubenswrapper[4734]: I1205 23:37:44.191666 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fea25d07-8cbc-4875-89e8-1752b0ee2a9e","Type":"ContainerStarted","Data":"7b1a944fc1433607538b1a6c27aa32cae8c9b81b5c64b1cdd03b9552b832840c"} Dec 05 23:37:44 crc kubenswrapper[4734]: I1205 23:37:44.193972 4734 generic.go:334] "Generic (PLEG): container finished" podID="fe0296c8-8413-4c49-ae9c-e68e1dcbdb03" containerID="a3f3439c1615c6066a0c7b7bb8ed0597b8885146363f6c94f8c1196d20ff839a" exitCode=0 Dec 05 23:37:44 crc kubenswrapper[4734]: I1205 23:37:44.194059 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a7d4-account-create-update-6mgtp" event={"ID":"fe0296c8-8413-4c49-ae9c-e68e1dcbdb03","Type":"ContainerDied","Data":"a3f3439c1615c6066a0c7b7bb8ed0597b8885146363f6c94f8c1196d20ff839a"} Dec 05 23:37:44 crc kubenswrapper[4734]: I1205 
23:37:44.217835 4734 generic.go:334] "Generic (PLEG): container finished" podID="8e04b323-6b27-4e58-9688-a7bc57317e6e" containerID="e26cf08257b3b808b32026a3695581b91ee4cf7ccec4d0a1ebab2542ba48752b" exitCode=0 Dec 05 23:37:44 crc kubenswrapper[4734]: I1205 23:37:44.217916 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e6fc-account-create-update-whgg9" event={"ID":"8e04b323-6b27-4e58-9688-a7bc57317e6e","Type":"ContainerDied","Data":"e26cf08257b3b808b32026a3695581b91ee4cf7ccec4d0a1ebab2542ba48752b"} Dec 05 23:37:44 crc kubenswrapper[4734]: I1205 23:37:44.225449 4734 generic.go:334] "Generic (PLEG): container finished" podID="9c2002ce-57e4-45bd-9110-9f7ebd50d0e7" containerID="081b81ceb2eb2e2ee332d925d7eac6f20d10155f1a6a949b002cff09c0a0ded3" exitCode=0 Dec 05 23:37:44 crc kubenswrapper[4734]: I1205 23:37:44.225541 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-s5w2r" event={"ID":"9c2002ce-57e4-45bd-9110-9f7ebd50d0e7","Type":"ContainerDied","Data":"081b81ceb2eb2e2ee332d925d7eac6f20d10155f1a6a949b002cff09c0a0ded3"} Dec 05 23:37:44 crc kubenswrapper[4734]: I1205 23:37:44.225583 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-s5w2r" event={"ID":"9c2002ce-57e4-45bd-9110-9f7ebd50d0e7","Type":"ContainerStarted","Data":"53b4620166eef8ca83f671728689e77abceaa99c7420ec36c0c26d57760e5539"} Dec 05 23:37:46 crc kubenswrapper[4734]: I1205 23:37:46.261751 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fea25d07-8cbc-4875-89e8-1752b0ee2a9e","Type":"ContainerStarted","Data":"b83c252da57b4e25a51fc59e74458039fc618eb3831a2cceec042cad60e93f5c"} Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.216106 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-bjk8x" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.247033 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a7d4-account-create-update-6mgtp" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.257047 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-s5w2r" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.293343 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e6fc-account-create-update-whgg9" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.299998 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6zn8b" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.306663 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f18c-account-create-update-qp7p8" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.310460 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5qx2\" (UniqueName: \"kubernetes.io/projected/1f3e2e20-cc04-41cd-94df-e0748036144a-kube-api-access-r5qx2\") pod \"1f3e2e20-cc04-41cd-94df-e0748036144a\" (UID: \"1f3e2e20-cc04-41cd-94df-e0748036144a\") " Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.310555 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f3e2e20-cc04-41cd-94df-e0748036144a-operator-scripts\") pod \"1f3e2e20-cc04-41cd-94df-e0748036144a\" (UID: \"1f3e2e20-cc04-41cd-94df-e0748036144a\") " Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.310608 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/8e04b323-6b27-4e58-9688-a7bc57317e6e-operator-scripts\") pod \"8e04b323-6b27-4e58-9688-a7bc57317e6e\" (UID: \"8e04b323-6b27-4e58-9688-a7bc57317e6e\") " Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.310639 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx45z\" (UniqueName: \"kubernetes.io/projected/fe0296c8-8413-4c49-ae9c-e68e1dcbdb03-kube-api-access-mx45z\") pod \"fe0296c8-8413-4c49-ae9c-e68e1dcbdb03\" (UID: \"fe0296c8-8413-4c49-ae9c-e68e1dcbdb03\") " Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.310698 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe0296c8-8413-4c49-ae9c-e68e1dcbdb03-operator-scripts\") pod \"fe0296c8-8413-4c49-ae9c-e68e1dcbdb03\" (UID: \"fe0296c8-8413-4c49-ae9c-e68e1dcbdb03\") " Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.310756 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwcq5\" (UniqueName: \"kubernetes.io/projected/8e04b323-6b27-4e58-9688-a7bc57317e6e-kube-api-access-pwcq5\") pod \"8e04b323-6b27-4e58-9688-a7bc57317e6e\" (UID: \"8e04b323-6b27-4e58-9688-a7bc57317e6e\") " Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.310797 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c2002ce-57e4-45bd-9110-9f7ebd50d0e7-operator-scripts\") pod \"9c2002ce-57e4-45bd-9110-9f7ebd50d0e7\" (UID: \"9c2002ce-57e4-45bd-9110-9f7ebd50d0e7\") " Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.310875 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj62p\" (UniqueName: \"kubernetes.io/projected/9c2002ce-57e4-45bd-9110-9f7ebd50d0e7-kube-api-access-sj62p\") pod \"9c2002ce-57e4-45bd-9110-9f7ebd50d0e7\" (UID: \"9c2002ce-57e4-45bd-9110-9f7ebd50d0e7\") " 
Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.311718 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f3e2e20-cc04-41cd-94df-e0748036144a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1f3e2e20-cc04-41cd-94df-e0748036144a" (UID: "1f3e2e20-cc04-41cd-94df-e0748036144a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.312282 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e04b323-6b27-4e58-9688-a7bc57317e6e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8e04b323-6b27-4e58-9688-a7bc57317e6e" (UID: "8e04b323-6b27-4e58-9688-a7bc57317e6e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.314110 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c2002ce-57e4-45bd-9110-9f7ebd50d0e7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9c2002ce-57e4-45bd-9110-9f7ebd50d0e7" (UID: "9c2002ce-57e4-45bd-9110-9f7ebd50d0e7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.314430 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe0296c8-8413-4c49-ae9c-e68e1dcbdb03-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fe0296c8-8413-4c49-ae9c-e68e1dcbdb03" (UID: "fe0296c8-8413-4c49-ae9c-e68e1dcbdb03"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.325515 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe0296c8-8413-4c49-ae9c-e68e1dcbdb03-kube-api-access-mx45z" (OuterVolumeSpecName: "kube-api-access-mx45z") pod "fe0296c8-8413-4c49-ae9c-e68e1dcbdb03" (UID: "fe0296c8-8413-4c49-ae9c-e68e1dcbdb03"). InnerVolumeSpecName "kube-api-access-mx45z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.325782 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e04b323-6b27-4e58-9688-a7bc57317e6e-kube-api-access-pwcq5" (OuterVolumeSpecName: "kube-api-access-pwcq5") pod "8e04b323-6b27-4e58-9688-a7bc57317e6e" (UID: "8e04b323-6b27-4e58-9688-a7bc57317e6e"). InnerVolumeSpecName "kube-api-access-pwcq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.328957 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c2002ce-57e4-45bd-9110-9f7ebd50d0e7-kube-api-access-sj62p" (OuterVolumeSpecName: "kube-api-access-sj62p") pod "9c2002ce-57e4-45bd-9110-9f7ebd50d0e7" (UID: "9c2002ce-57e4-45bd-9110-9f7ebd50d0e7"). InnerVolumeSpecName "kube-api-access-sj62p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.333091 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f3e2e20-cc04-41cd-94df-e0748036144a-kube-api-access-r5qx2" (OuterVolumeSpecName: "kube-api-access-r5qx2") pod "1f3e2e20-cc04-41cd-94df-e0748036144a" (UID: "1f3e2e20-cc04-41cd-94df-e0748036144a"). InnerVolumeSpecName "kube-api-access-r5qx2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.400787 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6zn8b" event={"ID":"6f5039b6-0875-4dd9-a5c4-9e6849dbb221","Type":"ContainerDied","Data":"bf4070ad550c15a0dd5494c58dbf5235b969ff019c891961e18562fa3e33ca18"} Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.400847 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf4070ad550c15a0dd5494c58dbf5235b969ff019c891961e18562fa3e33ca18" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.400933 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6zn8b" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.402773 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-khd2f" event={"ID":"019f8649-b37e-4970-a742-33afa217a2b4","Type":"ContainerStarted","Data":"0e4d94548c85906251bc8fe747697a3172511be772c9842c23bee31dc5c05b93"} Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.404793 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bjk8x" event={"ID":"1f3e2e20-cc04-41cd-94df-e0748036144a","Type":"ContainerDied","Data":"c547ac6e5fb0c4ea2415a88e4ea43485bd86ef49504fb84ae27ad0d34fb553a0"} Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.404837 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c547ac6e5fb0c4ea2415a88e4ea43485bd86ef49504fb84ae27ad0d34fb553a0" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.404976 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-bjk8x" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.407278 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f18c-account-create-update-qp7p8" event={"ID":"26927571-4070-4160-9e19-82f06d7d2a06","Type":"ContainerDied","Data":"af57c1d43c2f4f8519f8e34626f8766440f9cc8d09f4ed4e2e9c923bc493b5c1"} Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.407309 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f18c-account-create-update-qp7p8" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.407320 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af57c1d43c2f4f8519f8e34626f8766440f9cc8d09f4ed4e2e9c923bc493b5c1" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.410459 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fea25d07-8cbc-4875-89e8-1752b0ee2a9e","Type":"ContainerStarted","Data":"e319cc16223c7a30d6e0776522afe1488d811aedd1ad015b0deb95be537c5671"} Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.411745 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26927571-4070-4160-9e19-82f06d7d2a06-operator-scripts\") pod \"26927571-4070-4160-9e19-82f06d7d2a06\" (UID: \"26927571-4070-4160-9e19-82f06d7d2a06\") " Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.411859 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nm6fn\" (UniqueName: \"kubernetes.io/projected/26927571-4070-4160-9e19-82f06d7d2a06-kube-api-access-nm6fn\") pod \"26927571-4070-4160-9e19-82f06d7d2a06\" (UID: \"26927571-4070-4160-9e19-82f06d7d2a06\") " Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.411897 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-pc8vc\" (UniqueName: \"kubernetes.io/projected/6f5039b6-0875-4dd9-a5c4-9e6849dbb221-kube-api-access-pc8vc\") pod \"6f5039b6-0875-4dd9-a5c4-9e6849dbb221\" (UID: \"6f5039b6-0875-4dd9-a5c4-9e6849dbb221\") " Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.412032 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f5039b6-0875-4dd9-a5c4-9e6849dbb221-operator-scripts\") pod \"6f5039b6-0875-4dd9-a5c4-9e6849dbb221\" (UID: \"6f5039b6-0875-4dd9-a5c4-9e6849dbb221\") " Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.412516 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5qx2\" (UniqueName: \"kubernetes.io/projected/1f3e2e20-cc04-41cd-94df-e0748036144a-kube-api-access-r5qx2\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.412555 4734 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f3e2e20-cc04-41cd-94df-e0748036144a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.412565 4734 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e04b323-6b27-4e58-9688-a7bc57317e6e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.412575 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx45z\" (UniqueName: \"kubernetes.io/projected/fe0296c8-8413-4c49-ae9c-e68e1dcbdb03-kube-api-access-mx45z\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.412585 4734 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe0296c8-8413-4c49-ae9c-e68e1dcbdb03-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:49 crc 
kubenswrapper[4734]: I1205 23:37:49.412594 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwcq5\" (UniqueName: \"kubernetes.io/projected/8e04b323-6b27-4e58-9688-a7bc57317e6e-kube-api-access-pwcq5\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.412605 4734 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c2002ce-57e4-45bd-9110-9f7ebd50d0e7-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.412615 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj62p\" (UniqueName: \"kubernetes.io/projected/9c2002ce-57e4-45bd-9110-9f7ebd50d0e7-kube-api-access-sj62p\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.413331 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26927571-4070-4160-9e19-82f06d7d2a06-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "26927571-4070-4160-9e19-82f06d7d2a06" (UID: "26927571-4070-4160-9e19-82f06d7d2a06"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.413395 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f5039b6-0875-4dd9-a5c4-9e6849dbb221-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6f5039b6-0875-4dd9-a5c4-9e6849dbb221" (UID: "6f5039b6-0875-4dd9-a5c4-9e6849dbb221"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.413705 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a7d4-account-create-update-6mgtp" event={"ID":"fe0296c8-8413-4c49-ae9c-e68e1dcbdb03","Type":"ContainerDied","Data":"ec6d095a4be2cf01b5138f866f3001bf3e354e13b406ee6105b63589d14cfacf"} Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.413733 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec6d095a4be2cf01b5138f866f3001bf3e354e13b406ee6105b63589d14cfacf" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.414304 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a7d4-account-create-update-6mgtp" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.415312 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e6fc-account-create-update-whgg9" event={"ID":"8e04b323-6b27-4e58-9688-a7bc57317e6e","Type":"ContainerDied","Data":"1e7ef7a970e395010cdf183b6fcef490f015ee501ef1c4be92d7a443f0061af4"} Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.415347 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e7ef7a970e395010cdf183b6fcef490f015ee501ef1c4be92d7a443f0061af4" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.415424 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e6fc-account-create-update-whgg9" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.418227 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f5039b6-0875-4dd9-a5c4-9e6849dbb221-kube-api-access-pc8vc" (OuterVolumeSpecName: "kube-api-access-pc8vc") pod "6f5039b6-0875-4dd9-a5c4-9e6849dbb221" (UID: "6f5039b6-0875-4dd9-a5c4-9e6849dbb221"). InnerVolumeSpecName "kube-api-access-pc8vc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.419930 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26927571-4070-4160-9e19-82f06d7d2a06-kube-api-access-nm6fn" (OuterVolumeSpecName: "kube-api-access-nm6fn") pod "26927571-4070-4160-9e19-82f06d7d2a06" (UID: "26927571-4070-4160-9e19-82f06d7d2a06"). InnerVolumeSpecName "kube-api-access-nm6fn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.427111 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-s5w2r" event={"ID":"9c2002ce-57e4-45bd-9110-9f7ebd50d0e7","Type":"ContainerDied","Data":"53b4620166eef8ca83f671728689e77abceaa99c7420ec36c0c26d57760e5539"} Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.427184 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53b4620166eef8ca83f671728689e77abceaa99c7420ec36c0c26d57760e5539" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.427513 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-s5w2r" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.429134 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-khd2f" podStartSLOduration=11.30986195 podStartE2EDuration="17.429122765s" podCreationTimestamp="2025-12-05 23:37:32 +0000 UTC" firstStartedPulling="2025-12-05 23:37:42.909833222 +0000 UTC m=+1083.593237498" lastFinishedPulling="2025-12-05 23:37:49.029094037 +0000 UTC m=+1089.712498313" observedRunningTime="2025-12-05 23:37:49.42224888 +0000 UTC m=+1090.105653166" watchObservedRunningTime="2025-12-05 23:37:49.429122765 +0000 UTC m=+1090.112527041" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.513814 4734 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f5039b6-0875-4dd9-a5c4-9e6849dbb221-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.514256 4734 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26927571-4070-4160-9e19-82f06d7d2a06-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.514268 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nm6fn\" (UniqueName: \"kubernetes.io/projected/26927571-4070-4160-9e19-82f06d7d2a06-kube-api-access-nm6fn\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:49 crc kubenswrapper[4734]: I1205 23:37:49.514280 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc8vc\" (UniqueName: \"kubernetes.io/projected/6f5039b6-0875-4dd9-a5c4-9e6849dbb221-kube-api-access-pc8vc\") on node \"crc\" DevicePath \"\"" Dec 05 23:37:50 crc kubenswrapper[4734]: I1205 23:37:50.447896 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"fea25d07-8cbc-4875-89e8-1752b0ee2a9e","Type":"ContainerStarted","Data":"6a0232b2434cfcc6fcc078bc19cf8b70c141ad1a2d491cc295354a7ba0bec833"} Dec 05 23:37:50 crc kubenswrapper[4734]: I1205 23:37:50.447954 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fea25d07-8cbc-4875-89e8-1752b0ee2a9e","Type":"ContainerStarted","Data":"9824f4feef6f30b58ceec9b68de75ac248da3eba89c6a62c5713996f081398a5"} Dec 05 23:37:51 crc kubenswrapper[4734]: I1205 23:37:51.461631 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fea25d07-8cbc-4875-89e8-1752b0ee2a9e","Type":"ContainerStarted","Data":"ef08bc2e0ca34cb9973a58179d9669899fd966901db429a83a4f8a925f5a5f17"} Dec 05 23:37:52 crc kubenswrapper[4734]: I1205 23:37:52.474507 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fea25d07-8cbc-4875-89e8-1752b0ee2a9e","Type":"ContainerStarted","Data":"6e874d496bb1c0a072a504675ac3e72d62b2dc83d07bb21282f6a034907986b3"} Dec 05 23:37:54 crc kubenswrapper[4734]: I1205 23:37:54.498450 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fea25d07-8cbc-4875-89e8-1752b0ee2a9e","Type":"ContainerStarted","Data":"e9a6c6611298ffa1bdc7793439ae3a59f0964a46044535883b248b5fa9bbe864"} Dec 05 23:37:57 crc kubenswrapper[4734]: I1205 23:37:57.527981 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wzn74" event={"ID":"741e9328-bc42-4fae-b3dd-316f3286fa42","Type":"ContainerStarted","Data":"d573c95c457ec42a3ba6fba3952780a656918af85cc999b56e5043e47c97c283"} Dec 05 23:37:57 crc kubenswrapper[4734]: I1205 23:37:57.534949 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fea25d07-8cbc-4875-89e8-1752b0ee2a9e","Type":"ContainerStarted","Data":"aeb70926ab01517662cde875aa7c575bc8cfca0b731e8440babaaf8f93117ae7"} Dec 05 23:37:57 crc 
kubenswrapper[4734]: I1205 23:37:57.551008 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-wzn74" podStartSLOduration=2.877892572 podStartE2EDuration="38.550979412s" podCreationTimestamp="2025-12-05 23:37:19 +0000 UTC" firstStartedPulling="2025-12-05 23:37:21.039315082 +0000 UTC m=+1061.722719358" lastFinishedPulling="2025-12-05 23:37:56.712401922 +0000 UTC m=+1097.395806198" observedRunningTime="2025-12-05 23:37:57.546187016 +0000 UTC m=+1098.229591302" watchObservedRunningTime="2025-12-05 23:37:57.550979412 +0000 UTC m=+1098.234383698" Dec 05 23:37:59 crc kubenswrapper[4734]: I1205 23:37:59.553863 4734 generic.go:334] "Generic (PLEG): container finished" podID="019f8649-b37e-4970-a742-33afa217a2b4" containerID="0e4d94548c85906251bc8fe747697a3172511be772c9842c23bee31dc5c05b93" exitCode=0 Dec 05 23:37:59 crc kubenswrapper[4734]: I1205 23:37:59.553976 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-khd2f" event={"ID":"019f8649-b37e-4970-a742-33afa217a2b4","Type":"ContainerDied","Data":"0e4d94548c85906251bc8fe747697a3172511be772c9842c23bee31dc5c05b93"} Dec 05 23:38:00 crc kubenswrapper[4734]: I1205 23:38:00.574469 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fea25d07-8cbc-4875-89e8-1752b0ee2a9e","Type":"ContainerStarted","Data":"746bf6191839d6d7ac902acfa3977944518e9c2edf5050e34bc94670fcad6066"} Dec 05 23:38:00 crc kubenswrapper[4734]: I1205 23:38:00.575878 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fea25d07-8cbc-4875-89e8-1752b0ee2a9e","Type":"ContainerStarted","Data":"11f7844128f22405eaf719fe5aa1acfcb4235f602549731404df632beda13edf"} Dec 05 23:38:00 crc kubenswrapper[4734]: I1205 23:38:00.575897 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"fea25d07-8cbc-4875-89e8-1752b0ee2a9e","Type":"ContainerStarted","Data":"fc1207d9105c7a2d363969c3d4e2bf2c835693e9f57dfa8dcda59f3955809ae1"} Dec 05 23:38:00 crc kubenswrapper[4734]: I1205 23:38:00.575909 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fea25d07-8cbc-4875-89e8-1752b0ee2a9e","Type":"ContainerStarted","Data":"2d833c9705a68e56d64f1012ad4f703ef87a0078f89e0b492c6840c376363844"} Dec 05 23:38:00 crc kubenswrapper[4734]: I1205 23:38:00.575920 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fea25d07-8cbc-4875-89e8-1752b0ee2a9e","Type":"ContainerStarted","Data":"7df070704687fea6c300a329ba4dcd7881fa97cc89e034d5d85477259467c48f"} Dec 05 23:38:00 crc kubenswrapper[4734]: I1205 23:38:00.938614 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-khd2f" Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.055395 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/019f8649-b37e-4970-a742-33afa217a2b4-config-data\") pod \"019f8649-b37e-4970-a742-33afa217a2b4\" (UID: \"019f8649-b37e-4970-a742-33afa217a2b4\") " Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.055736 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/019f8649-b37e-4970-a742-33afa217a2b4-combined-ca-bundle\") pod \"019f8649-b37e-4970-a742-33afa217a2b4\" (UID: \"019f8649-b37e-4970-a742-33afa217a2b4\") " Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.055836 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kc4gl\" (UniqueName: \"kubernetes.io/projected/019f8649-b37e-4970-a742-33afa217a2b4-kube-api-access-kc4gl\") pod \"019f8649-b37e-4970-a742-33afa217a2b4\" (UID: 
\"019f8649-b37e-4970-a742-33afa217a2b4\") " Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.070966 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/019f8649-b37e-4970-a742-33afa217a2b4-kube-api-access-kc4gl" (OuterVolumeSpecName: "kube-api-access-kc4gl") pod "019f8649-b37e-4970-a742-33afa217a2b4" (UID: "019f8649-b37e-4970-a742-33afa217a2b4"). InnerVolumeSpecName "kube-api-access-kc4gl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.087250 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/019f8649-b37e-4970-a742-33afa217a2b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "019f8649-b37e-4970-a742-33afa217a2b4" (UID: "019f8649-b37e-4970-a742-33afa217a2b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.109384 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/019f8649-b37e-4970-a742-33afa217a2b4-config-data" (OuterVolumeSpecName: "config-data") pod "019f8649-b37e-4970-a742-33afa217a2b4" (UID: "019f8649-b37e-4970-a742-33afa217a2b4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.159820 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/019f8649-b37e-4970-a742-33afa217a2b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.159857 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kc4gl\" (UniqueName: \"kubernetes.io/projected/019f8649-b37e-4970-a742-33afa217a2b4-kube-api-access-kc4gl\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.159871 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/019f8649-b37e-4970-a742-33afa217a2b4-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.586032 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-khd2f" event={"ID":"019f8649-b37e-4970-a742-33afa217a2b4","Type":"ContainerDied","Data":"cb00d76f20f9c41bd3a668a07795293482c273004c765b77f84d9d0bfe4cd9db"} Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.586080 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb00d76f20f9c41bd3a668a07795293482c273004c765b77f84d9d0bfe4cd9db" Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.586151 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-khd2f" Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.597050 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fea25d07-8cbc-4875-89e8-1752b0ee2a9e","Type":"ContainerStarted","Data":"186c29b399eefadc6da0a7ab27f2e36f85f95708902f32b2dce00314ad3e026e"} Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.597116 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fea25d07-8cbc-4875-89e8-1752b0ee2a9e","Type":"ContainerStarted","Data":"148d2ecb0a14facd4034f6552d196e47722e939e163f6dc32a63e1c35f219cb3"} Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.646356 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=39.367553232 podStartE2EDuration="55.646327304s" podCreationTimestamp="2025-12-05 23:37:06 +0000 UTC" firstStartedPulling="2025-12-05 23:37:43.11413582 +0000 UTC m=+1083.797540096" lastFinishedPulling="2025-12-05 23:37:59.392909882 +0000 UTC m=+1100.076314168" observedRunningTime="2025-12-05 23:38:01.637589292 +0000 UTC m=+1102.320993588" watchObservedRunningTime="2025-12-05 23:38:01.646327304 +0000 UTC m=+1102.329731580" Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.942883 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-wjkcq"] Dec 05 23:38:01 crc kubenswrapper[4734]: E1205 23:38:01.953059 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e04b323-6b27-4e58-9688-a7bc57317e6e" containerName="mariadb-account-create-update" Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.953102 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e04b323-6b27-4e58-9688-a7bc57317e6e" containerName="mariadb-account-create-update" Dec 05 23:38:01 crc kubenswrapper[4734]: E1205 23:38:01.953125 4734 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9c2002ce-57e4-45bd-9110-9f7ebd50d0e7" containerName="mariadb-database-create" Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.953439 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c2002ce-57e4-45bd-9110-9f7ebd50d0e7" containerName="mariadb-database-create" Dec 05 23:38:01 crc kubenswrapper[4734]: E1205 23:38:01.953480 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8162a1e5-af9e-495c-926f-e83571a03720" containerName="ovn-config" Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.953487 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="8162a1e5-af9e-495c-926f-e83571a03720" containerName="ovn-config" Dec 05 23:38:01 crc kubenswrapper[4734]: E1205 23:38:01.953513 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f5039b6-0875-4dd9-a5c4-9e6849dbb221" containerName="mariadb-database-create" Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.953523 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f5039b6-0875-4dd9-a5c4-9e6849dbb221" containerName="mariadb-database-create" Dec 05 23:38:01 crc kubenswrapper[4734]: E1205 23:38:01.953594 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="019f8649-b37e-4970-a742-33afa217a2b4" containerName="keystone-db-sync" Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.953602 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="019f8649-b37e-4970-a742-33afa217a2b4" containerName="keystone-db-sync" Dec 05 23:38:01 crc kubenswrapper[4734]: E1205 23:38:01.953616 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe0296c8-8413-4c49-ae9c-e68e1dcbdb03" containerName="mariadb-account-create-update" Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.953622 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0296c8-8413-4c49-ae9c-e68e1dcbdb03" containerName="mariadb-account-create-update" Dec 05 23:38:01 crc kubenswrapper[4734]: E1205 23:38:01.953654 4734 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="1f3e2e20-cc04-41cd-94df-e0748036144a" containerName="mariadb-database-create" Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.953664 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f3e2e20-cc04-41cd-94df-e0748036144a" containerName="mariadb-database-create" Dec 05 23:38:01 crc kubenswrapper[4734]: E1205 23:38:01.953682 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26927571-4070-4160-9e19-82f06d7d2a06" containerName="mariadb-account-create-update" Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.953688 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="26927571-4070-4160-9e19-82f06d7d2a06" containerName="mariadb-account-create-update" Dec 05 23:38:01 crc kubenswrapper[4734]: E1205 23:38:01.953716 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1e03821-b44b-4ce9-8fb9-6831bf8b087f" containerName="swift-ring-rebalance" Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.953723 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1e03821-b44b-4ce9-8fb9-6831bf8b087f" containerName="swift-ring-rebalance" Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.966235 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="26927571-4070-4160-9e19-82f06d7d2a06" containerName="mariadb-account-create-update" Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.966509 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1e03821-b44b-4ce9-8fb9-6831bf8b087f" containerName="swift-ring-rebalance" Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.966810 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="8162a1e5-af9e-495c-926f-e83571a03720" containerName="ovn-config" Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.966843 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e04b323-6b27-4e58-9688-a7bc57317e6e" containerName="mariadb-account-create-update" 
Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.967219 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="019f8649-b37e-4970-a742-33afa217a2b4" containerName="keystone-db-sync" Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.967335 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c2002ce-57e4-45bd-9110-9f7ebd50d0e7" containerName="mariadb-database-create" Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.967370 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f5039b6-0875-4dd9-a5c4-9e6849dbb221" containerName="mariadb-database-create" Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.967399 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f3e2e20-cc04-41cd-94df-e0748036144a" containerName="mariadb-database-create" Dec 05 23:38:01 crc kubenswrapper[4734]: I1205 23:38:01.967422 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe0296c8-8413-4c49-ae9c-e68e1dcbdb03" containerName="mariadb-account-create-update" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.019116 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-wjkcq" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.068038 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-wjkcq"] Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.106401 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-g8vv2"] Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.107869 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-g8vv2" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.125028 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qf7c2" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.126659 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5rsx\" (UniqueName: \"kubernetes.io/projected/70c2f508-fa86-4b19-96f2-4311b58ad70f-kube-api-access-t5rsx\") pod \"dnsmasq-dns-f877ddd87-wjkcq\" (UID: \"70c2f508-fa86-4b19-96f2-4311b58ad70f\") " pod="openstack/dnsmasq-dns-f877ddd87-wjkcq" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.126729 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70c2f508-fa86-4b19-96f2-4311b58ad70f-dns-svc\") pod \"dnsmasq-dns-f877ddd87-wjkcq\" (UID: \"70c2f508-fa86-4b19-96f2-4311b58ad70f\") " pod="openstack/dnsmasq-dns-f877ddd87-wjkcq" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.126756 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70c2f508-fa86-4b19-96f2-4311b58ad70f-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-wjkcq\" (UID: \"70c2f508-fa86-4b19-96f2-4311b58ad70f\") " pod="openstack/dnsmasq-dns-f877ddd87-wjkcq" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.126775 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70c2f508-fa86-4b19-96f2-4311b58ad70f-config\") pod \"dnsmasq-dns-f877ddd87-wjkcq\" (UID: \"70c2f508-fa86-4b19-96f2-4311b58ad70f\") " pod="openstack/dnsmasq-dns-f877ddd87-wjkcq" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.126820 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70c2f508-fa86-4b19-96f2-4311b58ad70f-ovsdbserver-sb\") pod \"dnsmasq-dns-f877ddd87-wjkcq\" (UID: \"70c2f508-fa86-4b19-96f2-4311b58ad70f\") " pod="openstack/dnsmasq-dns-f877ddd87-wjkcq" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.133584 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.133651 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.133665 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.134261 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.135005 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-g8vv2"] Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.172874 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-wjkcq"] Dec 05 23:38:02 crc kubenswrapper[4734]: E1205 23:38:02.173874 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-t5rsx ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-f877ddd87-wjkcq" podUID="70c2f508-fa86-4b19-96f2-4311b58ad70f" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.208489 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-769b7bcdf7-h7m8l"] Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.210685 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-769b7bcdf7-h7m8l" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.215703 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.216457 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-c2zdd" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.226682 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.227439 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.228059 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70c2f508-fa86-4b19-96f2-4311b58ad70f-dns-svc\") pod \"dnsmasq-dns-f877ddd87-wjkcq\" (UID: \"70c2f508-fa86-4b19-96f2-4311b58ad70f\") " pod="openstack/dnsmasq-dns-f877ddd87-wjkcq" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.228108 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9db3688-e0c9-423a-9c15-406e359fec75-scripts\") pod \"keystone-bootstrap-g8vv2\" (UID: \"d9db3688-e0c9-423a-9c15-406e359fec75\") " pod="openstack/keystone-bootstrap-g8vv2" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.228156 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70c2f508-fa86-4b19-96f2-4311b58ad70f-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-wjkcq\" (UID: \"70c2f508-fa86-4b19-96f2-4311b58ad70f\") " pod="openstack/dnsmasq-dns-f877ddd87-wjkcq" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.228187 4734 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70c2f508-fa86-4b19-96f2-4311b58ad70f-config\") pod \"dnsmasq-dns-f877ddd87-wjkcq\" (UID: \"70c2f508-fa86-4b19-96f2-4311b58ad70f\") " pod="openstack/dnsmasq-dns-f877ddd87-wjkcq" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.228244 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d9db3688-e0c9-423a-9c15-406e359fec75-credential-keys\") pod \"keystone-bootstrap-g8vv2\" (UID: \"d9db3688-e0c9-423a-9c15-406e359fec75\") " pod="openstack/keystone-bootstrap-g8vv2" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.228265 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70c2f508-fa86-4b19-96f2-4311b58ad70f-ovsdbserver-sb\") pod \"dnsmasq-dns-f877ddd87-wjkcq\" (UID: \"70c2f508-fa86-4b19-96f2-4311b58ad70f\") " pod="openstack/dnsmasq-dns-f877ddd87-wjkcq" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.228293 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9db3688-e0c9-423a-9c15-406e359fec75-config-data\") pod \"keystone-bootstrap-g8vv2\" (UID: \"d9db3688-e0c9-423a-9c15-406e359fec75\") " pod="openstack/keystone-bootstrap-g8vv2" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.228314 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f58q8\" (UniqueName: \"kubernetes.io/projected/d9db3688-e0c9-423a-9c15-406e359fec75-kube-api-access-f58q8\") pod \"keystone-bootstrap-g8vv2\" (UID: \"d9db3688-e0c9-423a-9c15-406e359fec75\") " pod="openstack/keystone-bootstrap-g8vv2" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.228369 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d9db3688-e0c9-423a-9c15-406e359fec75-fernet-keys\") pod \"keystone-bootstrap-g8vv2\" (UID: \"d9db3688-e0c9-423a-9c15-406e359fec75\") " pod="openstack/keystone-bootstrap-g8vv2" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.228384 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9db3688-e0c9-423a-9c15-406e359fec75-combined-ca-bundle\") pod \"keystone-bootstrap-g8vv2\" (UID: \"d9db3688-e0c9-423a-9c15-406e359fec75\") " pod="openstack/keystone-bootstrap-g8vv2" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.228412 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5rsx\" (UniqueName: \"kubernetes.io/projected/70c2f508-fa86-4b19-96f2-4311b58ad70f-kube-api-access-t5rsx\") pod \"dnsmasq-dns-f877ddd87-wjkcq\" (UID: \"70c2f508-fa86-4b19-96f2-4311b58ad70f\") " pod="openstack/dnsmasq-dns-f877ddd87-wjkcq" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.238328 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70c2f508-fa86-4b19-96f2-4311b58ad70f-ovsdbserver-sb\") pod \"dnsmasq-dns-f877ddd87-wjkcq\" (UID: \"70c2f508-fa86-4b19-96f2-4311b58ad70f\") " pod="openstack/dnsmasq-dns-f877ddd87-wjkcq" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.246431 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70c2f508-fa86-4b19-96f2-4311b58ad70f-dns-svc\") pod \"dnsmasq-dns-f877ddd87-wjkcq\" (UID: \"70c2f508-fa86-4b19-96f2-4311b58ad70f\") " pod="openstack/dnsmasq-dns-f877ddd87-wjkcq" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.248836 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-xtrp6"] Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 
23:38:02.250468 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-xtrp6" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.251544 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70c2f508-fa86-4b19-96f2-4311b58ad70f-config\") pod \"dnsmasq-dns-f877ddd87-wjkcq\" (UID: \"70c2f508-fa86-4b19-96f2-4311b58ad70f\") " pod="openstack/dnsmasq-dns-f877ddd87-wjkcq" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.252295 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70c2f508-fa86-4b19-96f2-4311b58ad70f-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-wjkcq\" (UID: \"70c2f508-fa86-4b19-96f2-4311b58ad70f\") " pod="openstack/dnsmasq-dns-f877ddd87-wjkcq" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.261852 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.306948 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5rsx\" (UniqueName: \"kubernetes.io/projected/70c2f508-fa86-4b19-96f2-4311b58ad70f-kube-api-access-t5rsx\") pod \"dnsmasq-dns-f877ddd87-wjkcq\" (UID: \"70c2f508-fa86-4b19-96f2-4311b58ad70f\") " pod="openstack/dnsmasq-dns-f877ddd87-wjkcq" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.317242 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-769b7bcdf7-h7m8l"] Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.338426 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0d1c6ca-ba99-4296-b215-d985a6cd3402-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-xtrp6\" (UID: \"d0d1c6ca-ba99-4296-b215-d985a6cd3402\") " 
pod="openstack/dnsmasq-dns-5959f8865f-xtrp6" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.338517 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9db3688-e0c9-423a-9c15-406e359fec75-scripts\") pod \"keystone-bootstrap-g8vv2\" (UID: \"d9db3688-e0c9-423a-9c15-406e359fec75\") " pod="openstack/keystone-bootstrap-g8vv2" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.338657 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0d1c6ca-ba99-4296-b215-d985a6cd3402-config\") pod \"dnsmasq-dns-5959f8865f-xtrp6\" (UID: \"d0d1c6ca-ba99-4296-b215-d985a6cd3402\") " pod="openstack/dnsmasq-dns-5959f8865f-xtrp6" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.338731 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d9db3688-e0c9-423a-9c15-406e359fec75-credential-keys\") pod \"keystone-bootstrap-g8vv2\" (UID: \"d9db3688-e0c9-423a-9c15-406e359fec75\") " pod="openstack/keystone-bootstrap-g8vv2" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.338767 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9dc8ed3-eab2-486b-9001-8ced7ac9ac22-scripts\") pod \"horizon-769b7bcdf7-h7m8l\" (UID: \"b9dc8ed3-eab2-486b-9001-8ced7ac9ac22\") " pod="openstack/horizon-769b7bcdf7-h7m8l" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.338808 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9db3688-e0c9-423a-9c15-406e359fec75-config-data\") pod \"keystone-bootstrap-g8vv2\" (UID: \"d9db3688-e0c9-423a-9c15-406e359fec75\") " pod="openstack/keystone-bootstrap-g8vv2" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 
23:38:02.338835 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f58q8\" (UniqueName: \"kubernetes.io/projected/d9db3688-e0c9-423a-9c15-406e359fec75-kube-api-access-f58q8\") pod \"keystone-bootstrap-g8vv2\" (UID: \"d9db3688-e0c9-423a-9c15-406e359fec75\") " pod="openstack/keystone-bootstrap-g8vv2" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.338871 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9dc8ed3-eab2-486b-9001-8ced7ac9ac22-logs\") pod \"horizon-769b7bcdf7-h7m8l\" (UID: \"b9dc8ed3-eab2-486b-9001-8ced7ac9ac22\") " pod="openstack/horizon-769b7bcdf7-h7m8l" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.338903 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5px8w\" (UniqueName: \"kubernetes.io/projected/d0d1c6ca-ba99-4296-b215-d985a6cd3402-kube-api-access-5px8w\") pod \"dnsmasq-dns-5959f8865f-xtrp6\" (UID: \"d0d1c6ca-ba99-4296-b215-d985a6cd3402\") " pod="openstack/dnsmasq-dns-5959f8865f-xtrp6" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.341665 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0d1c6ca-ba99-4296-b215-d985a6cd3402-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-xtrp6\" (UID: \"d0d1c6ca-ba99-4296-b215-d985a6cd3402\") " pod="openstack/dnsmasq-dns-5959f8865f-xtrp6" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.341712 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9dc8ed3-eab2-486b-9001-8ced7ac9ac22-config-data\") pod \"horizon-769b7bcdf7-h7m8l\" (UID: \"b9dc8ed3-eab2-486b-9001-8ced7ac9ac22\") " pod="openstack/horizon-769b7bcdf7-h7m8l" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 
23:38:02.341799 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b9dc8ed3-eab2-486b-9001-8ced7ac9ac22-horizon-secret-key\") pod \"horizon-769b7bcdf7-h7m8l\" (UID: \"b9dc8ed3-eab2-486b-9001-8ced7ac9ac22\") " pod="openstack/horizon-769b7bcdf7-h7m8l" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.341827 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d0d1c6ca-ba99-4296-b215-d985a6cd3402-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-xtrp6\" (UID: \"d0d1c6ca-ba99-4296-b215-d985a6cd3402\") " pod="openstack/dnsmasq-dns-5959f8865f-xtrp6" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.341845 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0d1c6ca-ba99-4296-b215-d985a6cd3402-dns-svc\") pod \"dnsmasq-dns-5959f8865f-xtrp6\" (UID: \"d0d1c6ca-ba99-4296-b215-d985a6cd3402\") " pod="openstack/dnsmasq-dns-5959f8865f-xtrp6" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.341877 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f9nc\" (UniqueName: \"kubernetes.io/projected/b9dc8ed3-eab2-486b-9001-8ced7ac9ac22-kube-api-access-4f9nc\") pod \"horizon-769b7bcdf7-h7m8l\" (UID: \"b9dc8ed3-eab2-486b-9001-8ced7ac9ac22\") " pod="openstack/horizon-769b7bcdf7-h7m8l" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.341903 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9db3688-e0c9-423a-9c15-406e359fec75-combined-ca-bundle\") pod \"keystone-bootstrap-g8vv2\" (UID: \"d9db3688-e0c9-423a-9c15-406e359fec75\") " pod="openstack/keystone-bootstrap-g8vv2" Dec 05 23:38:02 crc 
kubenswrapper[4734]: I1205 23:38:02.341922 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d9db3688-e0c9-423a-9c15-406e359fec75-fernet-keys\") pod \"keystone-bootstrap-g8vv2\" (UID: \"d9db3688-e0c9-423a-9c15-406e359fec75\") " pod="openstack/keystone-bootstrap-g8vv2" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.347262 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d9db3688-e0c9-423a-9c15-406e359fec75-fernet-keys\") pod \"keystone-bootstrap-g8vv2\" (UID: \"d9db3688-e0c9-423a-9c15-406e359fec75\") " pod="openstack/keystone-bootstrap-g8vv2" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.358937 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9db3688-e0c9-423a-9c15-406e359fec75-config-data\") pod \"keystone-bootstrap-g8vv2\" (UID: \"d9db3688-e0c9-423a-9c15-406e359fec75\") " pod="openstack/keystone-bootstrap-g8vv2" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.359817 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-xtrp6"] Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.361115 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9db3688-e0c9-423a-9c15-406e359fec75-scripts\") pod \"keystone-bootstrap-g8vv2\" (UID: \"d9db3688-e0c9-423a-9c15-406e359fec75\") " pod="openstack/keystone-bootstrap-g8vv2" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.361325 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9db3688-e0c9-423a-9c15-406e359fec75-combined-ca-bundle\") pod \"keystone-bootstrap-g8vv2\" (UID: \"d9db3688-e0c9-423a-9c15-406e359fec75\") " pod="openstack/keystone-bootstrap-g8vv2" Dec 05 23:38:02 crc 
kubenswrapper[4734]: I1205 23:38:02.361677 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d9db3688-e0c9-423a-9c15-406e359fec75-credential-keys\") pod \"keystone-bootstrap-g8vv2\" (UID: \"d9db3688-e0c9-423a-9c15-406e359fec75\") " pod="openstack/keystone-bootstrap-g8vv2" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.385921 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-xsvx9"] Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.388931 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xsvx9" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.396273 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.396645 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.417425 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f58q8\" (UniqueName: \"kubernetes.io/projected/d9db3688-e0c9-423a-9c15-406e359fec75-kube-api-access-f58q8\") pod \"keystone-bootstrap-g8vv2\" (UID: \"d9db3688-e0c9-423a-9c15-406e359fec75\") " pod="openstack/keystone-bootstrap-g8vv2" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.417461 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-rgfg5" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.439519 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.445650 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9dc8ed3-eab2-486b-9001-8ced7ac9ac22-scripts\") pod \"horizon-769b7bcdf7-h7m8l\" (UID: 
\"b9dc8ed3-eab2-486b-9001-8ced7ac9ac22\") " pod="openstack/horizon-769b7bcdf7-h7m8l" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.445717 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9dc8ed3-eab2-486b-9001-8ced7ac9ac22-logs\") pod \"horizon-769b7bcdf7-h7m8l\" (UID: \"b9dc8ed3-eab2-486b-9001-8ced7ac9ac22\") " pod="openstack/horizon-769b7bcdf7-h7m8l" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.445740 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5px8w\" (UniqueName: \"kubernetes.io/projected/d0d1c6ca-ba99-4296-b215-d985a6cd3402-kube-api-access-5px8w\") pod \"dnsmasq-dns-5959f8865f-xtrp6\" (UID: \"d0d1c6ca-ba99-4296-b215-d985a6cd3402\") " pod="openstack/dnsmasq-dns-5959f8865f-xtrp6" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.445757 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0d1c6ca-ba99-4296-b215-d985a6cd3402-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-xtrp6\" (UID: \"d0d1c6ca-ba99-4296-b215-d985a6cd3402\") " pod="openstack/dnsmasq-dns-5959f8865f-xtrp6" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.445776 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9dc8ed3-eab2-486b-9001-8ced7ac9ac22-config-data\") pod \"horizon-769b7bcdf7-h7m8l\" (UID: \"b9dc8ed3-eab2-486b-9001-8ced7ac9ac22\") " pod="openstack/horizon-769b7bcdf7-h7m8l" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.445811 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b9dc8ed3-eab2-486b-9001-8ced7ac9ac22-horizon-secret-key\") pod \"horizon-769b7bcdf7-h7m8l\" (UID: \"b9dc8ed3-eab2-486b-9001-8ced7ac9ac22\") " pod="openstack/horizon-769b7bcdf7-h7m8l" 
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.445829 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d0d1c6ca-ba99-4296-b215-d985a6cd3402-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-xtrp6\" (UID: \"d0d1c6ca-ba99-4296-b215-d985a6cd3402\") " pod="openstack/dnsmasq-dns-5959f8865f-xtrp6"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.445846 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0d1c6ca-ba99-4296-b215-d985a6cd3402-dns-svc\") pod \"dnsmasq-dns-5959f8865f-xtrp6\" (UID: \"d0d1c6ca-ba99-4296-b215-d985a6cd3402\") " pod="openstack/dnsmasq-dns-5959f8865f-xtrp6"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.445864 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f9nc\" (UniqueName: \"kubernetes.io/projected/b9dc8ed3-eab2-486b-9001-8ced7ac9ac22-kube-api-access-4f9nc\") pod \"horizon-769b7bcdf7-h7m8l\" (UID: \"b9dc8ed3-eab2-486b-9001-8ced7ac9ac22\") " pod="openstack/horizon-769b7bcdf7-h7m8l"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.445907 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0d1c6ca-ba99-4296-b215-d985a6cd3402-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-xtrp6\" (UID: \"d0d1c6ca-ba99-4296-b215-d985a6cd3402\") " pod="openstack/dnsmasq-dns-5959f8865f-xtrp6"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.445951 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0d1c6ca-ba99-4296-b215-d985a6cd3402-config\") pod \"dnsmasq-dns-5959f8865f-xtrp6\" (UID: \"d0d1c6ca-ba99-4296-b215-d985a6cd3402\") " pod="openstack/dnsmasq-dns-5959f8865f-xtrp6"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.453842 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9dc8ed3-eab2-486b-9001-8ced7ac9ac22-config-data\") pod \"horizon-769b7bcdf7-h7m8l\" (UID: \"b9dc8ed3-eab2-486b-9001-8ced7ac9ac22\") " pod="openstack/horizon-769b7bcdf7-h7m8l"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.454427 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0d1c6ca-ba99-4296-b215-d985a6cd3402-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-xtrp6\" (UID: \"d0d1c6ca-ba99-4296-b215-d985a6cd3402\") " pod="openstack/dnsmasq-dns-5959f8865f-xtrp6"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.454994 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9dc8ed3-eab2-486b-9001-8ced7ac9ac22-scripts\") pod \"horizon-769b7bcdf7-h7m8l\" (UID: \"b9dc8ed3-eab2-486b-9001-8ced7ac9ac22\") " pod="openstack/horizon-769b7bcdf7-h7m8l"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.454991 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0d1c6ca-ba99-4296-b215-d985a6cd3402-dns-svc\") pod \"dnsmasq-dns-5959f8865f-xtrp6\" (UID: \"d0d1c6ca-ba99-4296-b215-d985a6cd3402\") " pod="openstack/dnsmasq-dns-5959f8865f-xtrp6"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.455886 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0d1c6ca-ba99-4296-b215-d985a6cd3402-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-xtrp6\" (UID: \"d0d1c6ca-ba99-4296-b215-d985a6cd3402\") " pod="openstack/dnsmasq-dns-5959f8865f-xtrp6"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.456062 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-g8vv2"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.457793 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d0d1c6ca-ba99-4296-b215-d985a6cd3402-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-xtrp6\" (UID: \"d0d1c6ca-ba99-4296-b215-d985a6cd3402\") " pod="openstack/dnsmasq-dns-5959f8865f-xtrp6"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.459780 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9dc8ed3-eab2-486b-9001-8ced7ac9ac22-logs\") pod \"horizon-769b7bcdf7-h7m8l\" (UID: \"b9dc8ed3-eab2-486b-9001-8ced7ac9ac22\") " pod="openstack/horizon-769b7bcdf7-h7m8l"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.460810 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.464433 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0d1c6ca-ba99-4296-b215-d985a6cd3402-config\") pod \"dnsmasq-dns-5959f8865f-xtrp6\" (UID: \"d0d1c6ca-ba99-4296-b215-d985a6cd3402\") " pod="openstack/dnsmasq-dns-5959f8865f-xtrp6"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.475619 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-xsvx9"]
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.481644 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.482770 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.488809 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b9dc8ed3-eab2-486b-9001-8ced7ac9ac22-horizon-secret-key\") pod \"horizon-769b7bcdf7-h7m8l\" (UID: \"b9dc8ed3-eab2-486b-9001-8ced7ac9ac22\") " pod="openstack/horizon-769b7bcdf7-h7m8l"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.493213 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f9nc\" (UniqueName: \"kubernetes.io/projected/b9dc8ed3-eab2-486b-9001-8ced7ac9ac22-kube-api-access-4f9nc\") pod \"horizon-769b7bcdf7-h7m8l\" (UID: \"b9dc8ed3-eab2-486b-9001-8ced7ac9ac22\") " pod="openstack/horizon-769b7bcdf7-h7m8l"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.503222 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5px8w\" (UniqueName: \"kubernetes.io/projected/d0d1c6ca-ba99-4296-b215-d985a6cd3402-kube-api-access-5px8w\") pod \"dnsmasq-dns-5959f8865f-xtrp6\" (UID: \"d0d1c6ca-ba99-4296-b215-d985a6cd3402\") " pod="openstack/dnsmasq-dns-5959f8865f-xtrp6"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.514991 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.539798 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-58fkh"]
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.556445 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-58fkh"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.562290 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/067f11aa-41d5-4a34-9f2e-33b35981e9ba-scripts\") pod \"ceilometer-0\" (UID: \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\") " pod="openstack/ceilometer-0"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.562342 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/067f11aa-41d5-4a34-9f2e-33b35981e9ba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\") " pod="openstack/ceilometer-0"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.562378 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4c26d17f-e341-41c5-9759-c0b265fcceea-db-sync-config-data\") pod \"cinder-db-sync-xsvx9\" (UID: \"4c26d17f-e341-41c5-9759-c0b265fcceea\") " pod="openstack/cinder-db-sync-xsvx9"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.562402 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4c26d17f-e341-41c5-9759-c0b265fcceea-etc-machine-id\") pod \"cinder-db-sync-xsvx9\" (UID: \"4c26d17f-e341-41c5-9759-c0b265fcceea\") " pod="openstack/cinder-db-sync-xsvx9"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.562425 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs8cz\" (UniqueName: \"kubernetes.io/projected/067f11aa-41d5-4a34-9f2e-33b35981e9ba-kube-api-access-qs8cz\") pod \"ceilometer-0\" (UID: \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\") " pod="openstack/ceilometer-0"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.562475 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/067f11aa-41d5-4a34-9f2e-33b35981e9ba-run-httpd\") pod \"ceilometer-0\" (UID: \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\") " pod="openstack/ceilometer-0"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.562580 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/067f11aa-41d5-4a34-9f2e-33b35981e9ba-config-data\") pod \"ceilometer-0\" (UID: \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\") " pod="openstack/ceilometer-0"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.562611 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c26d17f-e341-41c5-9759-c0b265fcceea-scripts\") pod \"cinder-db-sync-xsvx9\" (UID: \"4c26d17f-e341-41c5-9759-c0b265fcceea\") " pod="openstack/cinder-db-sync-xsvx9"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.562627 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/067f11aa-41d5-4a34-9f2e-33b35981e9ba-log-httpd\") pod \"ceilometer-0\" (UID: \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\") " pod="openstack/ceilometer-0"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.562647 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/067f11aa-41d5-4a34-9f2e-33b35981e9ba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\") " pod="openstack/ceilometer-0"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.562680 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c26d17f-e341-41c5-9759-c0b265fcceea-config-data\") pod \"cinder-db-sync-xsvx9\" (UID: \"4c26d17f-e341-41c5-9759-c0b265fcceea\") " pod="openstack/cinder-db-sync-xsvx9"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.562731 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf9fs\" (UniqueName: \"kubernetes.io/projected/4c26d17f-e341-41c5-9759-c0b265fcceea-kube-api-access-pf9fs\") pod \"cinder-db-sync-xsvx9\" (UID: \"4c26d17f-e341-41c5-9759-c0b265fcceea\") " pod="openstack/cinder-db-sync-xsvx9"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.562769 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c26d17f-e341-41c5-9759-c0b265fcceea-combined-ca-bundle\") pod \"cinder-db-sync-xsvx9\" (UID: \"4c26d17f-e341-41c5-9759-c0b265fcceea\") " pod="openstack/cinder-db-sync-xsvx9"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.563057 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-769b7bcdf7-h7m8l"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.567851 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-pmh6z"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.568144 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.569285 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.574574 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-58fkh"]
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.590127 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-c55dfd787-8xfc2"]
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.597231 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-c55dfd787-8xfc2"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.617310 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-wjkcq"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.645089 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-xtrp6"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.665064 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c427c6a-2e27-4e8d-9088-1cdad55da769-combined-ca-bundle\") pod \"neutron-db-sync-58fkh\" (UID: \"6c427c6a-2e27-4e8d-9088-1cdad55da769\") " pod="openstack/neutron-db-sync-58fkh"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.665134 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/067f11aa-41d5-4a34-9f2e-33b35981e9ba-config-data\") pod \"ceilometer-0\" (UID: \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\") " pod="openstack/ceilometer-0"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.665164 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c26d17f-e341-41c5-9759-c0b265fcceea-scripts\") pod \"cinder-db-sync-xsvx9\" (UID: \"4c26d17f-e341-41c5-9759-c0b265fcceea\") " pod="openstack/cinder-db-sync-xsvx9"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.665213 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/067f11aa-41d5-4a34-9f2e-33b35981e9ba-log-httpd\") pod \"ceilometer-0\" (UID: \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\") " pod="openstack/ceilometer-0"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.665236 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/067f11aa-41d5-4a34-9f2e-33b35981e9ba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\") " pod="openstack/ceilometer-0"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.665267 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6c427c6a-2e27-4e8d-9088-1cdad55da769-config\") pod \"neutron-db-sync-58fkh\" (UID: \"6c427c6a-2e27-4e8d-9088-1cdad55da769\") " pod="openstack/neutron-db-sync-58fkh"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.665288 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c26d17f-e341-41c5-9759-c0b265fcceea-config-data\") pod \"cinder-db-sync-xsvx9\" (UID: \"4c26d17f-e341-41c5-9759-c0b265fcceea\") " pod="openstack/cinder-db-sync-xsvx9"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.665315 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf9fs\" (UniqueName: \"kubernetes.io/projected/4c26d17f-e341-41c5-9759-c0b265fcceea-kube-api-access-pf9fs\") pod \"cinder-db-sync-xsvx9\" (UID: \"4c26d17f-e341-41c5-9759-c0b265fcceea\") " pod="openstack/cinder-db-sync-xsvx9"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.665426 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m54j6\" (UniqueName: \"kubernetes.io/projected/6c427c6a-2e27-4e8d-9088-1cdad55da769-kube-api-access-m54j6\") pod \"neutron-db-sync-58fkh\" (UID: \"6c427c6a-2e27-4e8d-9088-1cdad55da769\") " pod="openstack/neutron-db-sync-58fkh"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.665462 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c26d17f-e341-41c5-9759-c0b265fcceea-combined-ca-bundle\") pod \"cinder-db-sync-xsvx9\" (UID: \"4c26d17f-e341-41c5-9759-c0b265fcceea\") " pod="openstack/cinder-db-sync-xsvx9"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.665494 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/067f11aa-41d5-4a34-9f2e-33b35981e9ba-scripts\") pod \"ceilometer-0\" (UID: \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\") " pod="openstack/ceilometer-0"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.665514 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/067f11aa-41d5-4a34-9f2e-33b35981e9ba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\") " pod="openstack/ceilometer-0"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.665554 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4c26d17f-e341-41c5-9759-c0b265fcceea-db-sync-config-data\") pod \"cinder-db-sync-xsvx9\" (UID: \"4c26d17f-e341-41c5-9759-c0b265fcceea\") " pod="openstack/cinder-db-sync-xsvx9"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.665572 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4c26d17f-e341-41c5-9759-c0b265fcceea-etc-machine-id\") pod \"cinder-db-sync-xsvx9\" (UID: \"4c26d17f-e341-41c5-9759-c0b265fcceea\") " pod="openstack/cinder-db-sync-xsvx9"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.665590 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs8cz\" (UniqueName: \"kubernetes.io/projected/067f11aa-41d5-4a34-9f2e-33b35981e9ba-kube-api-access-qs8cz\") pod \"ceilometer-0\" (UID: \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\") " pod="openstack/ceilometer-0"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.665628 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/067f11aa-41d5-4a34-9f2e-33b35981e9ba-run-httpd\") pod \"ceilometer-0\" (UID: \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\") " pod="openstack/ceilometer-0"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.666439 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/067f11aa-41d5-4a34-9f2e-33b35981e9ba-run-httpd\") pod \"ceilometer-0\" (UID: \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\") " pod="openstack/ceilometer-0"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.670010 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/067f11aa-41d5-4a34-9f2e-33b35981e9ba-log-httpd\") pod \"ceilometer-0\" (UID: \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\") " pod="openstack/ceilometer-0"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.683089 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-c55dfd787-8xfc2"]
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.683247 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4c26d17f-e341-41c5-9759-c0b265fcceea-etc-machine-id\") pod \"cinder-db-sync-xsvx9\" (UID: \"4c26d17f-e341-41c5-9759-c0b265fcceea\") " pod="openstack/cinder-db-sync-xsvx9"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.685314 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/067f11aa-41d5-4a34-9f2e-33b35981e9ba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\") " pod="openstack/ceilometer-0"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.702949 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/067f11aa-41d5-4a34-9f2e-33b35981e9ba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\") " pod="openstack/ceilometer-0"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.712780 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c26d17f-e341-41c5-9759-c0b265fcceea-scripts\") pod \"cinder-db-sync-xsvx9\" (UID: \"4c26d17f-e341-41c5-9759-c0b265fcceea\") " pod="openstack/cinder-db-sync-xsvx9"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.714239 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-6x54r"]
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.715893 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-6x54r"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.718201 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.738103 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-9bmgt"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.738495 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.740224 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-6x54r"]
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.743891 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c26d17f-e341-41c5-9759-c0b265fcceea-config-data\") pod \"cinder-db-sync-xsvx9\" (UID: \"4c26d17f-e341-41c5-9759-c0b265fcceea\") " pod="openstack/cinder-db-sync-xsvx9"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.753761 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-xtrp6"]
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.754438 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/067f11aa-41d5-4a34-9f2e-33b35981e9ba-scripts\") pod \"ceilometer-0\" (UID: \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\") " pod="openstack/ceilometer-0"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.759561 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c26d17f-e341-41c5-9759-c0b265fcceea-combined-ca-bundle\") pod \"cinder-db-sync-xsvx9\" (UID: \"4c26d17f-e341-41c5-9759-c0b265fcceea\") " pod="openstack/cinder-db-sync-xsvx9"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.768121 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/067f11aa-41d5-4a34-9f2e-33b35981e9ba-config-data\") pod \"ceilometer-0\" (UID: \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\") " pod="openstack/ceilometer-0"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.768678 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-f82cb"]
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.769824 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs8cz\" (UniqueName: \"kubernetes.io/projected/067f11aa-41d5-4a34-9f2e-33b35981e9ba-kube-api-access-qs8cz\") pod \"ceilometer-0\" (UID: \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\") " pod="openstack/ceilometer-0"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.770156 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-f82cb"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.771084 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-config-data\") pod \"horizon-c55dfd787-8xfc2\" (UID: \"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb\") " pod="openstack/horizon-c55dfd787-8xfc2"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.771129 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-logs\") pod \"horizon-c55dfd787-8xfc2\" (UID: \"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb\") " pod="openstack/horizon-c55dfd787-8xfc2"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.771174 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-horizon-secret-key\") pod \"horizon-c55dfd787-8xfc2\" (UID: \"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb\") " pod="openstack/horizon-c55dfd787-8xfc2"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.771203 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vss5g\" (UniqueName: \"kubernetes.io/projected/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-kube-api-access-vss5g\") pod \"horizon-c55dfd787-8xfc2\" (UID: \"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb\") " pod="openstack/horizon-c55dfd787-8xfc2"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.771241 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c427c6a-2e27-4e8d-9088-1cdad55da769-combined-ca-bundle\") pod \"neutron-db-sync-58fkh\" (UID: \"6c427c6a-2e27-4e8d-9088-1cdad55da769\") " pod="openstack/neutron-db-sync-58fkh"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.771297 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-scripts\") pod \"horizon-c55dfd787-8xfc2\" (UID: \"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb\") " pod="openstack/horizon-c55dfd787-8xfc2"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.771321 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6c427c6a-2e27-4e8d-9088-1cdad55da769-config\") pod \"neutron-db-sync-58fkh\" (UID: \"6c427c6a-2e27-4e8d-9088-1cdad55da769\") " pod="openstack/neutron-db-sync-58fkh"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.771362 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m54j6\" (UniqueName: \"kubernetes.io/projected/6c427c6a-2e27-4e8d-9088-1cdad55da769-kube-api-access-m54j6\") pod \"neutron-db-sync-58fkh\" (UID: \"6c427c6a-2e27-4e8d-9088-1cdad55da769\") " pod="openstack/neutron-db-sync-58fkh"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.772248 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4c26d17f-e341-41c5-9759-c0b265fcceea-db-sync-config-data\") pod \"cinder-db-sync-xsvx9\" (UID: \"4c26d17f-e341-41c5-9759-c0b265fcceea\") " pod="openstack/cinder-db-sync-xsvx9"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.780177 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf9fs\" (UniqueName: \"kubernetes.io/projected/4c26d17f-e341-41c5-9759-c0b265fcceea-kube-api-access-pf9fs\") pod \"cinder-db-sync-xsvx9\" (UID: \"4c26d17f-e341-41c5-9759-c0b265fcceea\") " pod="openstack/cinder-db-sync-xsvx9"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.781013 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-db8zc"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.785976 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c427c6a-2e27-4e8d-9088-1cdad55da769-combined-ca-bundle\") pod \"neutron-db-sync-58fkh\" (UID: \"6c427c6a-2e27-4e8d-9088-1cdad55da769\") " pod="openstack/neutron-db-sync-58fkh"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.787421 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.796695 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-f82cb"]
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.797287 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.798368 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6c427c6a-2e27-4e8d-9088-1cdad55da769-config\") pod \"neutron-db-sync-58fkh\" (UID: \"6c427c6a-2e27-4e8d-9088-1cdad55da769\") " pod="openstack/neutron-db-sync-58fkh"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.864237 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m54j6\" (UniqueName: \"kubernetes.io/projected/6c427c6a-2e27-4e8d-9088-1cdad55da769-kube-api-access-m54j6\") pod \"neutron-db-sync-58fkh\" (UID: \"6c427c6a-2e27-4e8d-9088-1cdad55da769\") " pod="openstack/neutron-db-sync-58fkh"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.885918 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-58fkh"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.902587 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e97c0b2c-1294-43eb-a424-5c04e198611e-scripts\") pod \"placement-db-sync-6x54r\" (UID: \"e97c0b2c-1294-43eb-a424-5c04e198611e\") " pod="openstack/placement-db-sync-6x54r"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.913159 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d71f9558-c417-4cc7-934f-258f388cced2-db-sync-config-data\") pod \"barbican-db-sync-f82cb\" (UID: \"d71f9558-c417-4cc7-934f-258f388cced2\") " pod="openstack/barbican-db-sync-f82cb"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.913336 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-config-data\") pod \"horizon-c55dfd787-8xfc2\" (UID: \"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb\") " pod="openstack/horizon-c55dfd787-8xfc2"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.913378 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqtfj\" (UniqueName: \"kubernetes.io/projected/e97c0b2c-1294-43eb-a424-5c04e198611e-kube-api-access-xqtfj\") pod \"placement-db-sync-6x54r\" (UID: \"e97c0b2c-1294-43eb-a424-5c04e198611e\") " pod="openstack/placement-db-sync-6x54r"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.913415 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-logs\") pod \"horizon-c55dfd787-8xfc2\" (UID: \"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb\") " pod="openstack/horizon-c55dfd787-8xfc2"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.913455 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e97c0b2c-1294-43eb-a424-5c04e198611e-logs\") pod \"placement-db-sync-6x54r\" (UID: \"e97c0b2c-1294-43eb-a424-5c04e198611e\") " pod="openstack/placement-db-sync-6x54r"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.913489 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e97c0b2c-1294-43eb-a424-5c04e198611e-combined-ca-bundle\") pod \"placement-db-sync-6x54r\" (UID: \"e97c0b2c-1294-43eb-a424-5c04e198611e\") " pod="openstack/placement-db-sync-6x54r"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.913596 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-horizon-secret-key\") pod \"horizon-c55dfd787-8xfc2\" (UID: \"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb\") " pod="openstack/horizon-c55dfd787-8xfc2"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.913622 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv9bp\" (UniqueName: \"kubernetes.io/projected/d71f9558-c417-4cc7-934f-258f388cced2-kube-api-access-pv9bp\") pod \"barbican-db-sync-f82cb\" (UID: \"d71f9558-c417-4cc7-934f-258f388cced2\") " pod="openstack/barbican-db-sync-f82cb"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.913655 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vss5g\" (UniqueName: \"kubernetes.io/projected/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-kube-api-access-vss5g\") pod \"horizon-c55dfd787-8xfc2\" (UID: \"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb\") " pod="openstack/horizon-c55dfd787-8xfc2"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.913931 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-scripts\") pod \"horizon-c55dfd787-8xfc2\" (UID: \"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb\") " pod="openstack/horizon-c55dfd787-8xfc2"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.913973 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d71f9558-c417-4cc7-934f-258f388cced2-combined-ca-bundle\") pod \"barbican-db-sync-f82cb\" (UID: \"d71f9558-c417-4cc7-934f-258f388cced2\") " pod="openstack/barbican-db-sync-f82cb"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.914023 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e97c0b2c-1294-43eb-a424-5c04e198611e-config-data\") pod \"placement-db-sync-6x54r\" (UID: \"e97c0b2c-1294-43eb-a424-5c04e198611e\") " pod="openstack/placement-db-sync-6x54r"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.917316 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-config-data\") pod \"horizon-c55dfd787-8xfc2\" (UID: \"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb\") " pod="openstack/horizon-c55dfd787-8xfc2"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.942280 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-logs\") pod \"horizon-c55dfd787-8xfc2\" (UID: \"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb\") " pod="openstack/horizon-c55dfd787-8xfc2"
Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.950383 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\"
(UniqueName: \"kubernetes.io/configmap/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-scripts\") pod \"horizon-c55dfd787-8xfc2\" (UID: \"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb\") " pod="openstack/horizon-c55dfd787-8xfc2" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.957986 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xsvx9" Dec 05 23:38:02 crc kubenswrapper[4734]: I1205 23:38:02.959215 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-x9hwd"] Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:02.999245 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-wjkcq" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.001745 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vss5g\" (UniqueName: \"kubernetes.io/projected/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-kube-api-access-vss5g\") pod \"horizon-c55dfd787-8xfc2\" (UID: \"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb\") " pod="openstack/horizon-c55dfd787-8xfc2" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.002578 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-x9hwd" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.012283 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-horizon-secret-key\") pod \"horizon-c55dfd787-8xfc2\" (UID: \"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb\") " pod="openstack/horizon-c55dfd787-8xfc2" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.038032 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e97c0b2c-1294-43eb-a424-5c04e198611e-scripts\") pod \"placement-db-sync-6x54r\" (UID: \"e97c0b2c-1294-43eb-a424-5c04e198611e\") " pod="openstack/placement-db-sync-6x54r" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.038089 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d71f9558-c417-4cc7-934f-258f388cced2-db-sync-config-data\") pod \"barbican-db-sync-f82cb\" (UID: \"d71f9558-c417-4cc7-934f-258f388cced2\") " pod="openstack/barbican-db-sync-f82cb" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.038162 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqtfj\" (UniqueName: \"kubernetes.io/projected/e97c0b2c-1294-43eb-a424-5c04e198611e-kube-api-access-xqtfj\") pod \"placement-db-sync-6x54r\" (UID: \"e97c0b2c-1294-43eb-a424-5c04e198611e\") " pod="openstack/placement-db-sync-6x54r" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.038193 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e97c0b2c-1294-43eb-a424-5c04e198611e-logs\") pod \"placement-db-sync-6x54r\" (UID: \"e97c0b2c-1294-43eb-a424-5c04e198611e\") " pod="openstack/placement-db-sync-6x54r" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 
23:38:03.038221 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e97c0b2c-1294-43eb-a424-5c04e198611e-combined-ca-bundle\") pod \"placement-db-sync-6x54r\" (UID: \"e97c0b2c-1294-43eb-a424-5c04e198611e\") " pod="openstack/placement-db-sync-6x54r" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.038299 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv9bp\" (UniqueName: \"kubernetes.io/projected/d71f9558-c417-4cc7-934f-258f388cced2-kube-api-access-pv9bp\") pod \"barbican-db-sync-f82cb\" (UID: \"d71f9558-c417-4cc7-934f-258f388cced2\") " pod="openstack/barbican-db-sync-f82cb" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.038414 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d71f9558-c417-4cc7-934f-258f388cced2-combined-ca-bundle\") pod \"barbican-db-sync-f82cb\" (UID: \"d71f9558-c417-4cc7-934f-258f388cced2\") " pod="openstack/barbican-db-sync-f82cb" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.038439 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e97c0b2c-1294-43eb-a424-5c04e198611e-config-data\") pod \"placement-db-sync-6x54r\" (UID: \"e97c0b2c-1294-43eb-a424-5c04e198611e\") " pod="openstack/placement-db-sync-6x54r" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.041877 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e97c0b2c-1294-43eb-a424-5c04e198611e-logs\") pod \"placement-db-sync-6x54r\" (UID: \"e97c0b2c-1294-43eb-a424-5c04e198611e\") " pod="openstack/placement-db-sync-6x54r" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.060651 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e97c0b2c-1294-43eb-a424-5c04e198611e-scripts\") pod \"placement-db-sync-6x54r\" (UID: \"e97c0b2c-1294-43eb-a424-5c04e198611e\") " pod="openstack/placement-db-sync-6x54r" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.064518 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d71f9558-c417-4cc7-934f-258f388cced2-combined-ca-bundle\") pod \"barbican-db-sync-f82cb\" (UID: \"d71f9558-c417-4cc7-934f-258f388cced2\") " pod="openstack/barbican-db-sync-f82cb" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.076764 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d71f9558-c417-4cc7-934f-258f388cced2-db-sync-config-data\") pod \"barbican-db-sync-f82cb\" (UID: \"d71f9558-c417-4cc7-934f-258f388cced2\") " pod="openstack/barbican-db-sync-f82cb" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.082243 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e97c0b2c-1294-43eb-a424-5c04e198611e-config-data\") pod \"placement-db-sync-6x54r\" (UID: \"e97c0b2c-1294-43eb-a424-5c04e198611e\") " pod="openstack/placement-db-sync-6x54r" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.084138 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqtfj\" (UniqueName: \"kubernetes.io/projected/e97c0b2c-1294-43eb-a424-5c04e198611e-kube-api-access-xqtfj\") pod \"placement-db-sync-6x54r\" (UID: \"e97c0b2c-1294-43eb-a424-5c04e198611e\") " pod="openstack/placement-db-sync-6x54r" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.085798 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv9bp\" (UniqueName: \"kubernetes.io/projected/d71f9558-c417-4cc7-934f-258f388cced2-kube-api-access-pv9bp\") pod \"barbican-db-sync-f82cb\" (UID: 
\"d71f9558-c417-4cc7-934f-258f388cced2\") " pod="openstack/barbican-db-sync-f82cb" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.125775 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e97c0b2c-1294-43eb-a424-5c04e198611e-combined-ca-bundle\") pod \"placement-db-sync-6x54r\" (UID: \"e97c0b2c-1294-43eb-a424-5c04e198611e\") " pod="openstack/placement-db-sync-6x54r" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.141394 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5rsx\" (UniqueName: \"kubernetes.io/projected/70c2f508-fa86-4b19-96f2-4311b58ad70f-kube-api-access-t5rsx\") pod \"70c2f508-fa86-4b19-96f2-4311b58ad70f\" (UID: \"70c2f508-fa86-4b19-96f2-4311b58ad70f\") " Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.141585 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70c2f508-fa86-4b19-96f2-4311b58ad70f-dns-svc\") pod \"70c2f508-fa86-4b19-96f2-4311b58ad70f\" (UID: \"70c2f508-fa86-4b19-96f2-4311b58ad70f\") " Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.141671 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70c2f508-fa86-4b19-96f2-4311b58ad70f-ovsdbserver-sb\") pod \"70c2f508-fa86-4b19-96f2-4311b58ad70f\" (UID: \"70c2f508-fa86-4b19-96f2-4311b58ad70f\") " Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.141694 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70c2f508-fa86-4b19-96f2-4311b58ad70f-config\") pod \"70c2f508-fa86-4b19-96f2-4311b58ad70f\" (UID: \"70c2f508-fa86-4b19-96f2-4311b58ad70f\") " Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.141726 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70c2f508-fa86-4b19-96f2-4311b58ad70f-ovsdbserver-nb\") pod \"70c2f508-fa86-4b19-96f2-4311b58ad70f\" (UID: \"70c2f508-fa86-4b19-96f2-4311b58ad70f\") " Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.142143 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a2261d63-5689-409a-8395-c652e5c2960e-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-x9hwd\" (UID: \"a2261d63-5689-409a-8395-c652e5c2960e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-x9hwd" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.142210 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2261d63-5689-409a-8395-c652e5c2960e-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-x9hwd\" (UID: \"a2261d63-5689-409a-8395-c652e5c2960e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-x9hwd" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.142254 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2261d63-5689-409a-8395-c652e5c2960e-config\") pod \"dnsmasq-dns-58dd9ff6bc-x9hwd\" (UID: \"a2261d63-5689-409a-8395-c652e5c2960e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-x9hwd" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.142302 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x55bq\" (UniqueName: \"kubernetes.io/projected/a2261d63-5689-409a-8395-c652e5c2960e-kube-api-access-x55bq\") pod \"dnsmasq-dns-58dd9ff6bc-x9hwd\" (UID: \"a2261d63-5689-409a-8395-c652e5c2960e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-x9hwd" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.142461 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2261d63-5689-409a-8395-c652e5c2960e-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-x9hwd\" (UID: \"a2261d63-5689-409a-8395-c652e5c2960e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-x9hwd" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.142500 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2261d63-5689-409a-8395-c652e5c2960e-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-x9hwd\" (UID: \"a2261d63-5689-409a-8395-c652e5c2960e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-x9hwd" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.142678 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70c2f508-fa86-4b19-96f2-4311b58ad70f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "70c2f508-fa86-4b19-96f2-4311b58ad70f" (UID: "70c2f508-fa86-4b19-96f2-4311b58ad70f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.143156 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70c2f508-fa86-4b19-96f2-4311b58ad70f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "70c2f508-fa86-4b19-96f2-4311b58ad70f" (UID: "70c2f508-fa86-4b19-96f2-4311b58ad70f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.143672 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70c2f508-fa86-4b19-96f2-4311b58ad70f-config" (OuterVolumeSpecName: "config") pod "70c2f508-fa86-4b19-96f2-4311b58ad70f" (UID: "70c2f508-fa86-4b19-96f2-4311b58ad70f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.144069 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70c2f508-fa86-4b19-96f2-4311b58ad70f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "70c2f508-fa86-4b19-96f2-4311b58ad70f" (UID: "70c2f508-fa86-4b19-96f2-4311b58ad70f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.144156 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-x9hwd"] Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.153346 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70c2f508-fa86-4b19-96f2-4311b58ad70f-kube-api-access-t5rsx" (OuterVolumeSpecName: "kube-api-access-t5rsx") pod "70c2f508-fa86-4b19-96f2-4311b58ad70f" (UID: "70c2f508-fa86-4b19-96f2-4311b58ad70f"). InnerVolumeSpecName "kube-api-access-t5rsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.236727 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-c55dfd787-8xfc2" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.243954 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a2261d63-5689-409a-8395-c652e5c2960e-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-x9hwd\" (UID: \"a2261d63-5689-409a-8395-c652e5c2960e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-x9hwd" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.244032 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2261d63-5689-409a-8395-c652e5c2960e-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-x9hwd\" (UID: \"a2261d63-5689-409a-8395-c652e5c2960e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-x9hwd" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.244064 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2261d63-5689-409a-8395-c652e5c2960e-config\") pod \"dnsmasq-dns-58dd9ff6bc-x9hwd\" (UID: \"a2261d63-5689-409a-8395-c652e5c2960e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-x9hwd" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.244100 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x55bq\" (UniqueName: \"kubernetes.io/projected/a2261d63-5689-409a-8395-c652e5c2960e-kube-api-access-x55bq\") pod \"dnsmasq-dns-58dd9ff6bc-x9hwd\" (UID: \"a2261d63-5689-409a-8395-c652e5c2960e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-x9hwd" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.244176 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2261d63-5689-409a-8395-c652e5c2960e-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-x9hwd\" (UID: \"a2261d63-5689-409a-8395-c652e5c2960e\") " 
pod="openstack/dnsmasq-dns-58dd9ff6bc-x9hwd" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.244200 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2261d63-5689-409a-8395-c652e5c2960e-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-x9hwd\" (UID: \"a2261d63-5689-409a-8395-c652e5c2960e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-x9hwd" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.244266 4734 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70c2f508-fa86-4b19-96f2-4311b58ad70f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.244278 4734 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70c2f508-fa86-4b19-96f2-4311b58ad70f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.244288 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70c2f508-fa86-4b19-96f2-4311b58ad70f-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.244297 4734 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70c2f508-fa86-4b19-96f2-4311b58ad70f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.244308 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5rsx\" (UniqueName: \"kubernetes.io/projected/70c2f508-fa86-4b19-96f2-4311b58ad70f-kube-api-access-t5rsx\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.245216 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/a2261d63-5689-409a-8395-c652e5c2960e-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-x9hwd\" (UID: \"a2261d63-5689-409a-8395-c652e5c2960e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-x9hwd" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.245247 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2261d63-5689-409a-8395-c652e5c2960e-config\") pod \"dnsmasq-dns-58dd9ff6bc-x9hwd\" (UID: \"a2261d63-5689-409a-8395-c652e5c2960e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-x9hwd" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.245819 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a2261d63-5689-409a-8395-c652e5c2960e-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-x9hwd\" (UID: \"a2261d63-5689-409a-8395-c652e5c2960e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-x9hwd" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.246288 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2261d63-5689-409a-8395-c652e5c2960e-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-x9hwd\" (UID: \"a2261d63-5689-409a-8395-c652e5c2960e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-x9hwd" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.246425 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2261d63-5689-409a-8395-c652e5c2960e-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-x9hwd\" (UID: \"a2261d63-5689-409a-8395-c652e5c2960e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-x9hwd" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.271284 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x55bq\" (UniqueName: \"kubernetes.io/projected/a2261d63-5689-409a-8395-c652e5c2960e-kube-api-access-x55bq\") pod 
\"dnsmasq-dns-58dd9ff6bc-x9hwd\" (UID: \"a2261d63-5689-409a-8395-c652e5c2960e\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-x9hwd" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.297773 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-6x54r" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.316810 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-f82cb" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.346765 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-x9hwd" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.353152 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-g8vv2"] Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.557278 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-769b7bcdf7-h7m8l"] Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.644846 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-wjkcq" Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.646406 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-769b7bcdf7-h7m8l" event={"ID":"b9dc8ed3-eab2-486b-9001-8ced7ac9ac22","Type":"ContainerStarted","Data":"3f81596039c8c3343db4a09330899db36e227b7a2f708e30745eccb0f18d88eb"} Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.646461 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-g8vv2" event={"ID":"d9db3688-e0c9-423a-9c15-406e359fec75","Type":"ContainerStarted","Data":"8369450afa2aad2b278ba8545ab04161ba5f7f73d5728856ae51e537df937399"} Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.715180 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-wjkcq"] Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.722675 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-wjkcq"] Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.752045 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-58fkh"] Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.767109 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 23:38:03 crc kubenswrapper[4734]: I1205 23:38:03.958699 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-xtrp6"] Dec 05 23:38:04 crc kubenswrapper[4734]: I1205 23:38:04.116895 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-c55dfd787-8xfc2"] Dec 05 23:38:04 crc kubenswrapper[4734]: I1205 23:38:04.126593 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-xsvx9"] Dec 05 23:38:04 crc kubenswrapper[4734]: I1205 23:38:04.145629 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-6x54r"] Dec 05 23:38:04 crc 
kubenswrapper[4734]: I1205 23:38:04.157127 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-f82cb"] Dec 05 23:38:04 crc kubenswrapper[4734]: I1205 23:38:04.190509 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-x9hwd"] Dec 05 23:38:04 crc kubenswrapper[4734]: I1205 23:38:04.669867 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-6x54r" event={"ID":"e97c0b2c-1294-43eb-a424-5c04e198611e","Type":"ContainerStarted","Data":"d90e61e0737d359b9174289e510067cdf3025e9e63d6c3410087a8753bc8b23b"} Dec 05 23:38:04 crc kubenswrapper[4734]: I1205 23:38:04.676120 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c55dfd787-8xfc2" event={"ID":"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb","Type":"ContainerStarted","Data":"50f36b899675afeb3c21ab167df5e282d91da6260923c9a33928cfd04b0d501d"} Dec 05 23:38:04 crc kubenswrapper[4734]: I1205 23:38:04.680314 4734 generic.go:334] "Generic (PLEG): container finished" podID="d0d1c6ca-ba99-4296-b215-d985a6cd3402" containerID="5e9a73f65abba6fcb4e012384278746331c7b24ac21ae41adf03912d8e61b730" exitCode=0 Dec 05 23:38:04 crc kubenswrapper[4734]: I1205 23:38:04.680426 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-xtrp6" event={"ID":"d0d1c6ca-ba99-4296-b215-d985a6cd3402","Type":"ContainerDied","Data":"5e9a73f65abba6fcb4e012384278746331c7b24ac21ae41adf03912d8e61b730"} Dec 05 23:38:04 crc kubenswrapper[4734]: I1205 23:38:04.680510 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-xtrp6" event={"ID":"d0d1c6ca-ba99-4296-b215-d985a6cd3402","Type":"ContainerStarted","Data":"f6b1103a727414d97f7525a1b949c058cc4f239eccb42686a4485cd180f5f617"} Dec 05 23:38:04 crc kubenswrapper[4734]: I1205 23:38:04.692813 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"067f11aa-41d5-4a34-9f2e-33b35981e9ba","Type":"ContainerStarted","Data":"702d6a05c4f6e9c8105d0d8906c578fd4fc0499f2201ec6262e0ec44d5713d78"} Dec 05 23:38:04 crc kubenswrapper[4734]: I1205 23:38:04.750163 4734 generic.go:334] "Generic (PLEG): container finished" podID="a2261d63-5689-409a-8395-c652e5c2960e" containerID="d7c956a71ae0f76861b7425ac025c57aae06ff8d8157384091bb5e0ba0e2f96e" exitCode=0 Dec 05 23:38:04 crc kubenswrapper[4734]: I1205 23:38:04.750314 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-x9hwd" event={"ID":"a2261d63-5689-409a-8395-c652e5c2960e","Type":"ContainerDied","Data":"d7c956a71ae0f76861b7425ac025c57aae06ff8d8157384091bb5e0ba0e2f96e"} Dec 05 23:38:04 crc kubenswrapper[4734]: I1205 23:38:04.750361 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-x9hwd" event={"ID":"a2261d63-5689-409a-8395-c652e5c2960e","Type":"ContainerStarted","Data":"378d6dbac9bf0e69575ee3921a9f752a6216057b4d6de30acef4fcef53feae46"} Dec 05 23:38:04 crc kubenswrapper[4734]: I1205 23:38:04.772463 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xsvx9" event={"ID":"4c26d17f-e341-41c5-9759-c0b265fcceea","Type":"ContainerStarted","Data":"48d3586a98696ec0f0c4d981bd7f858c4ce00e5979652fad7b1e49dcc55d061e"} Dec 05 23:38:04 crc kubenswrapper[4734]: I1205 23:38:04.791132 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-58fkh" event={"ID":"6c427c6a-2e27-4e8d-9088-1cdad55da769","Type":"ContainerStarted","Data":"adea1b041e1d7c952fad653f11acbe0d9b04cc0b8533443a7ec38aa9b962d3bf"} Dec 05 23:38:04 crc kubenswrapper[4734]: I1205 23:38:04.791213 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-58fkh" event={"ID":"6c427c6a-2e27-4e8d-9088-1cdad55da769","Type":"ContainerStarted","Data":"231ea6db99fa0c7726b777738208814111587ad3583a8ae90bb8291d6897245e"} Dec 05 23:38:04 crc kubenswrapper[4734]: 
I1205 23:38:04.841176 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-g8vv2" event={"ID":"d9db3688-e0c9-423a-9c15-406e359fec75","Type":"ContainerStarted","Data":"c84a2d92f1ff7d0be99abd71752073d96b69b8ea001fe342c500401c23b2e406"} Dec 05 23:38:04 crc kubenswrapper[4734]: I1205 23:38:04.884671 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-f82cb" event={"ID":"d71f9558-c417-4cc7-934f-258f388cced2","Type":"ContainerStarted","Data":"97eb2b3c326f18eb0c102d9aa55d35921ea87086744685d6726ecadc78607af8"} Dec 05 23:38:04 crc kubenswrapper[4734]: I1205 23:38:04.886115 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-58fkh" podStartSLOduration=2.8861021989999998 podStartE2EDuration="2.886102199s" podCreationTimestamp="2025-12-05 23:38:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:38:04.884560462 +0000 UTC m=+1105.567964748" watchObservedRunningTime="2025-12-05 23:38:04.886102199 +0000 UTC m=+1105.569506475" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.184519 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-g8vv2" podStartSLOduration=4.184487938 podStartE2EDuration="4.184487938s" podCreationTimestamp="2025-12-05 23:38:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:38:04.948830979 +0000 UTC m=+1105.632235255" watchObservedRunningTime="2025-12-05 23:38:05.184487938 +0000 UTC m=+1105.867892214" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.243739 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-769b7bcdf7-h7m8l"] Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.274925 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.301348 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7ddfc99ccc-pn7dw"] Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.304606 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7ddfc99ccc-pn7dw" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.329342 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7ddfc99ccc-pn7dw"] Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.390865 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-xtrp6" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.424609 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/904ca7a9-2b5c-4636-bfeb-f98ea4981e2f-horizon-secret-key\") pod \"horizon-7ddfc99ccc-pn7dw\" (UID: \"904ca7a9-2b5c-4636-bfeb-f98ea4981e2f\") " pod="openstack/horizon-7ddfc99ccc-pn7dw" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.424662 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/904ca7a9-2b5c-4636-bfeb-f98ea4981e2f-logs\") pod \"horizon-7ddfc99ccc-pn7dw\" (UID: \"904ca7a9-2b5c-4636-bfeb-f98ea4981e2f\") " pod="openstack/horizon-7ddfc99ccc-pn7dw" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.424839 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/904ca7a9-2b5c-4636-bfeb-f98ea4981e2f-config-data\") pod \"horizon-7ddfc99ccc-pn7dw\" (UID: \"904ca7a9-2b5c-4636-bfeb-f98ea4981e2f\") " pod="openstack/horizon-7ddfc99ccc-pn7dw" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.425908 4734 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpdkc\" (UniqueName: \"kubernetes.io/projected/904ca7a9-2b5c-4636-bfeb-f98ea4981e2f-kube-api-access-zpdkc\") pod \"horizon-7ddfc99ccc-pn7dw\" (UID: \"904ca7a9-2b5c-4636-bfeb-f98ea4981e2f\") " pod="openstack/horizon-7ddfc99ccc-pn7dw" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.426124 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/904ca7a9-2b5c-4636-bfeb-f98ea4981e2f-scripts\") pod \"horizon-7ddfc99ccc-pn7dw\" (UID: \"904ca7a9-2b5c-4636-bfeb-f98ea4981e2f\") " pod="openstack/horizon-7ddfc99ccc-pn7dw" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.527380 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0d1c6ca-ba99-4296-b215-d985a6cd3402-ovsdbserver-nb\") pod \"d0d1c6ca-ba99-4296-b215-d985a6cd3402\" (UID: \"d0d1c6ca-ba99-4296-b215-d985a6cd3402\") " Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.527635 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0d1c6ca-ba99-4296-b215-d985a6cd3402-ovsdbserver-sb\") pod \"d0d1c6ca-ba99-4296-b215-d985a6cd3402\" (UID: \"d0d1c6ca-ba99-4296-b215-d985a6cd3402\") " Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.527737 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d0d1c6ca-ba99-4296-b215-d985a6cd3402-dns-swift-storage-0\") pod \"d0d1c6ca-ba99-4296-b215-d985a6cd3402\" (UID: \"d0d1c6ca-ba99-4296-b215-d985a6cd3402\") " Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.527807 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d0d1c6ca-ba99-4296-b215-d985a6cd3402-config\") pod \"d0d1c6ca-ba99-4296-b215-d985a6cd3402\" (UID: \"d0d1c6ca-ba99-4296-b215-d985a6cd3402\") " Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.527908 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5px8w\" (UniqueName: \"kubernetes.io/projected/d0d1c6ca-ba99-4296-b215-d985a6cd3402-kube-api-access-5px8w\") pod \"d0d1c6ca-ba99-4296-b215-d985a6cd3402\" (UID: \"d0d1c6ca-ba99-4296-b215-d985a6cd3402\") " Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.527960 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0d1c6ca-ba99-4296-b215-d985a6cd3402-dns-svc\") pod \"d0d1c6ca-ba99-4296-b215-d985a6cd3402\" (UID: \"d0d1c6ca-ba99-4296-b215-d985a6cd3402\") " Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.528382 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpdkc\" (UniqueName: \"kubernetes.io/projected/904ca7a9-2b5c-4636-bfeb-f98ea4981e2f-kube-api-access-zpdkc\") pod \"horizon-7ddfc99ccc-pn7dw\" (UID: \"904ca7a9-2b5c-4636-bfeb-f98ea4981e2f\") " pod="openstack/horizon-7ddfc99ccc-pn7dw" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.528415 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/904ca7a9-2b5c-4636-bfeb-f98ea4981e2f-scripts\") pod \"horizon-7ddfc99ccc-pn7dw\" (UID: \"904ca7a9-2b5c-4636-bfeb-f98ea4981e2f\") " pod="openstack/horizon-7ddfc99ccc-pn7dw" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.528454 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/904ca7a9-2b5c-4636-bfeb-f98ea4981e2f-horizon-secret-key\") pod \"horizon-7ddfc99ccc-pn7dw\" (UID: \"904ca7a9-2b5c-4636-bfeb-f98ea4981e2f\") " 
pod="openstack/horizon-7ddfc99ccc-pn7dw" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.528470 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/904ca7a9-2b5c-4636-bfeb-f98ea4981e2f-logs\") pod \"horizon-7ddfc99ccc-pn7dw\" (UID: \"904ca7a9-2b5c-4636-bfeb-f98ea4981e2f\") " pod="openstack/horizon-7ddfc99ccc-pn7dw" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.528487 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/904ca7a9-2b5c-4636-bfeb-f98ea4981e2f-config-data\") pod \"horizon-7ddfc99ccc-pn7dw\" (UID: \"904ca7a9-2b5c-4636-bfeb-f98ea4981e2f\") " pod="openstack/horizon-7ddfc99ccc-pn7dw" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.530189 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/904ca7a9-2b5c-4636-bfeb-f98ea4981e2f-logs\") pod \"horizon-7ddfc99ccc-pn7dw\" (UID: \"904ca7a9-2b5c-4636-bfeb-f98ea4981e2f\") " pod="openstack/horizon-7ddfc99ccc-pn7dw" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.536235 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/904ca7a9-2b5c-4636-bfeb-f98ea4981e2f-scripts\") pod \"horizon-7ddfc99ccc-pn7dw\" (UID: \"904ca7a9-2b5c-4636-bfeb-f98ea4981e2f\") " pod="openstack/horizon-7ddfc99ccc-pn7dw" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.539292 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/904ca7a9-2b5c-4636-bfeb-f98ea4981e2f-config-data\") pod \"horizon-7ddfc99ccc-pn7dw\" (UID: \"904ca7a9-2b5c-4636-bfeb-f98ea4981e2f\") " pod="openstack/horizon-7ddfc99ccc-pn7dw" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.553932 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/d0d1c6ca-ba99-4296-b215-d985a6cd3402-kube-api-access-5px8w" (OuterVolumeSpecName: "kube-api-access-5px8w") pod "d0d1c6ca-ba99-4296-b215-d985a6cd3402" (UID: "d0d1c6ca-ba99-4296-b215-d985a6cd3402"). InnerVolumeSpecName "kube-api-access-5px8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.555197 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpdkc\" (UniqueName: \"kubernetes.io/projected/904ca7a9-2b5c-4636-bfeb-f98ea4981e2f-kube-api-access-zpdkc\") pod \"horizon-7ddfc99ccc-pn7dw\" (UID: \"904ca7a9-2b5c-4636-bfeb-f98ea4981e2f\") " pod="openstack/horizon-7ddfc99ccc-pn7dw" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.567900 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/904ca7a9-2b5c-4636-bfeb-f98ea4981e2f-horizon-secret-key\") pod \"horizon-7ddfc99ccc-pn7dw\" (UID: \"904ca7a9-2b5c-4636-bfeb-f98ea4981e2f\") " pod="openstack/horizon-7ddfc99ccc-pn7dw" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.583788 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0d1c6ca-ba99-4296-b215-d985a6cd3402-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d0d1c6ca-ba99-4296-b215-d985a6cd3402" (UID: "d0d1c6ca-ba99-4296-b215-d985a6cd3402"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.586098 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0d1c6ca-ba99-4296-b215-d985a6cd3402-config" (OuterVolumeSpecName: "config") pod "d0d1c6ca-ba99-4296-b215-d985a6cd3402" (UID: "d0d1c6ca-ba99-4296-b215-d985a6cd3402"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.593240 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0d1c6ca-ba99-4296-b215-d985a6cd3402-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d0d1c6ca-ba99-4296-b215-d985a6cd3402" (UID: "d0d1c6ca-ba99-4296-b215-d985a6cd3402"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.594216 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0d1c6ca-ba99-4296-b215-d985a6cd3402-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d0d1c6ca-ba99-4296-b215-d985a6cd3402" (UID: "d0d1c6ca-ba99-4296-b215-d985a6cd3402"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.598287 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0d1c6ca-ba99-4296-b215-d985a6cd3402-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d0d1c6ca-ba99-4296-b215-d985a6cd3402" (UID: "d0d1c6ca-ba99-4296-b215-d985a6cd3402"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.630977 4734 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0d1c6ca-ba99-4296-b215-d985a6cd3402-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.631041 4734 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d0d1c6ca-ba99-4296-b215-d985a6cd3402-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.631055 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0d1c6ca-ba99-4296-b215-d985a6cd3402-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.631065 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5px8w\" (UniqueName: \"kubernetes.io/projected/d0d1c6ca-ba99-4296-b215-d985a6cd3402-kube-api-access-5px8w\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.631077 4734 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0d1c6ca-ba99-4296-b215-d985a6cd3402-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.631084 4734 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0d1c6ca-ba99-4296-b215-d985a6cd3402-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.679257 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7ddfc99ccc-pn7dw" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.691452 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70c2f508-fa86-4b19-96f2-4311b58ad70f" path="/var/lib/kubelet/pods/70c2f508-fa86-4b19-96f2-4311b58ad70f/volumes" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.920765 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-xtrp6" event={"ID":"d0d1c6ca-ba99-4296-b215-d985a6cd3402","Type":"ContainerDied","Data":"f6b1103a727414d97f7525a1b949c058cc4f239eccb42686a4485cd180f5f617"} Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.920850 4734 scope.go:117] "RemoveContainer" containerID="5e9a73f65abba6fcb4e012384278746331c7b24ac21ae41adf03912d8e61b730" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.920845 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-xtrp6" Dec 05 23:38:05 crc kubenswrapper[4734]: I1205 23:38:05.934681 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-x9hwd" event={"ID":"a2261d63-5689-409a-8395-c652e5c2960e","Type":"ContainerStarted","Data":"d8398c8e8e80acadc385209acc0873aba343061e9edc9f33a2e8dcc5787ccc78"} Dec 05 23:38:06 crc kubenswrapper[4734]: I1205 23:38:06.028488 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-xtrp6"] Dec 05 23:38:06 crc kubenswrapper[4734]: I1205 23:38:06.040958 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-xtrp6"] Dec 05 23:38:06 crc kubenswrapper[4734]: I1205 23:38:06.044515 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58dd9ff6bc-x9hwd" podStartSLOduration=4.044484514 podStartE2EDuration="4.044484514s" podCreationTimestamp="2025-12-05 23:38:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:38:06.041464051 +0000 UTC m=+1106.724868327" watchObservedRunningTime="2025-12-05 23:38:06.044484514 +0000 UTC m=+1106.727888790" Dec 05 23:38:06 crc kubenswrapper[4734]: I1205 23:38:06.470356 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7ddfc99ccc-pn7dw"] Dec 05 23:38:06 crc kubenswrapper[4734]: W1205 23:38:06.502730 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod904ca7a9_2b5c_4636_bfeb_f98ea4981e2f.slice/crio-f76ee2e0d5b2f58fe7a2de45df3803f0bb3bc6b0d9f9a3277ad4c6c089cbef58 WatchSource:0}: Error finding container f76ee2e0d5b2f58fe7a2de45df3803f0bb3bc6b0d9f9a3277ad4c6c089cbef58: Status 404 returned error can't find the container with id f76ee2e0d5b2f58fe7a2de45df3803f0bb3bc6b0d9f9a3277ad4c6c089cbef58 Dec 05 23:38:06 crc kubenswrapper[4734]: I1205 23:38:06.968247 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7ddfc99ccc-pn7dw" event={"ID":"904ca7a9-2b5c-4636-bfeb-f98ea4981e2f","Type":"ContainerStarted","Data":"f76ee2e0d5b2f58fe7a2de45df3803f0bb3bc6b0d9f9a3277ad4c6c089cbef58"} Dec 05 23:38:06 crc kubenswrapper[4734]: I1205 23:38:06.977515 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58dd9ff6bc-x9hwd" Dec 05 23:38:07 crc kubenswrapper[4734]: I1205 23:38:07.628364 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0d1c6ca-ba99-4296-b215-d985a6cd3402" path="/var/lib/kubelet/pods/d0d1c6ca-ba99-4296-b215-d985a6cd3402/volumes" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.505354 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-c55dfd787-8xfc2"] Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.558346 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5d469948dd-n7t4x"] Dec 05 23:38:11 crc 
kubenswrapper[4734]: E1205 23:38:11.559150 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0d1c6ca-ba99-4296-b215-d985a6cd3402" containerName="init" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.559178 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0d1c6ca-ba99-4296-b215-d985a6cd3402" containerName="init" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.559414 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0d1c6ca-ba99-4296-b215-d985a6cd3402" containerName="init" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.560662 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d469948dd-n7t4x" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.566637 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.579829 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d469948dd-n7t4x"] Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.643289 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7ddfc99ccc-pn7dw"] Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.675797 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-755fc898d8-dlnbz"] Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.677779 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-755fc898d8-dlnbz" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.691286 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-755fc898d8-dlnbz"] Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.715302 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c96cd173-4707-4edc-a92e-35db297082e2-logs\") pod \"horizon-5d469948dd-n7t4x\" (UID: \"c96cd173-4707-4edc-a92e-35db297082e2\") " pod="openstack/horizon-5d469948dd-n7t4x" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.715396 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c96cd173-4707-4edc-a92e-35db297082e2-combined-ca-bundle\") pod \"horizon-5d469948dd-n7t4x\" (UID: \"c96cd173-4707-4edc-a92e-35db297082e2\") " pod="openstack/horizon-5d469948dd-n7t4x" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.715435 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c96cd173-4707-4edc-a92e-35db297082e2-config-data\") pod \"horizon-5d469948dd-n7t4x\" (UID: \"c96cd173-4707-4edc-a92e-35db297082e2\") " pod="openstack/horizon-5d469948dd-n7t4x" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.715622 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/c96cd173-4707-4edc-a92e-35db297082e2-horizon-tls-certs\") pod \"horizon-5d469948dd-n7t4x\" (UID: \"c96cd173-4707-4edc-a92e-35db297082e2\") " pod="openstack/horizon-5d469948dd-n7t4x" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.715652 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" 
(UniqueName: \"kubernetes.io/secret/c96cd173-4707-4edc-a92e-35db297082e2-horizon-secret-key\") pod \"horizon-5d469948dd-n7t4x\" (UID: \"c96cd173-4707-4edc-a92e-35db297082e2\") " pod="openstack/horizon-5d469948dd-n7t4x" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.715706 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c96cd173-4707-4edc-a92e-35db297082e2-scripts\") pod \"horizon-5d469948dd-n7t4x\" (UID: \"c96cd173-4707-4edc-a92e-35db297082e2\") " pod="openstack/horizon-5d469948dd-n7t4x" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.715725 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw48z\" (UniqueName: \"kubernetes.io/projected/c96cd173-4707-4edc-a92e-35db297082e2-kube-api-access-rw48z\") pod \"horizon-5d469948dd-n7t4x\" (UID: \"c96cd173-4707-4edc-a92e-35db297082e2\") " pod="openstack/horizon-5d469948dd-n7t4x" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.818230 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c96cd173-4707-4edc-a92e-35db297082e2-combined-ca-bundle\") pod \"horizon-5d469948dd-n7t4x\" (UID: \"c96cd173-4707-4edc-a92e-35db297082e2\") " pod="openstack/horizon-5d469948dd-n7t4x" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.818287 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c96cd173-4707-4edc-a92e-35db297082e2-config-data\") pod \"horizon-5d469948dd-n7t4x\" (UID: \"c96cd173-4707-4edc-a92e-35db297082e2\") " pod="openstack/horizon-5d469948dd-n7t4x" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.818479 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bbcbbde9-55c9-48dc-866d-ab670775e9b3-combined-ca-bundle\") pod \"horizon-755fc898d8-dlnbz\" (UID: \"bbcbbde9-55c9-48dc-866d-ab670775e9b3\") " pod="openstack/horizon-755fc898d8-dlnbz" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.818571 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bbcbbde9-55c9-48dc-866d-ab670775e9b3-config-data\") pod \"horizon-755fc898d8-dlnbz\" (UID: \"bbcbbde9-55c9-48dc-866d-ab670775e9b3\") " pod="openstack/horizon-755fc898d8-dlnbz" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.818612 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbcbbde9-55c9-48dc-866d-ab670775e9b3-horizon-tls-certs\") pod \"horizon-755fc898d8-dlnbz\" (UID: \"bbcbbde9-55c9-48dc-866d-ab670775e9b3\") " pod="openstack/horizon-755fc898d8-dlnbz" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.818655 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/c96cd173-4707-4edc-a92e-35db297082e2-horizon-tls-certs\") pod \"horizon-5d469948dd-n7t4x\" (UID: \"c96cd173-4707-4edc-a92e-35db297082e2\") " pod="openstack/horizon-5d469948dd-n7t4x" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.818687 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bbcbbde9-55c9-48dc-866d-ab670775e9b3-scripts\") pod \"horizon-755fc898d8-dlnbz\" (UID: \"bbcbbde9-55c9-48dc-866d-ab670775e9b3\") " pod="openstack/horizon-755fc898d8-dlnbz" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.818727 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/c96cd173-4707-4edc-a92e-35db297082e2-horizon-secret-key\") pod \"horizon-5d469948dd-n7t4x\" (UID: \"c96cd173-4707-4edc-a92e-35db297082e2\") " pod="openstack/horizon-5d469948dd-n7t4x" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.818762 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bbcbbde9-55c9-48dc-866d-ab670775e9b3-horizon-secret-key\") pod \"horizon-755fc898d8-dlnbz\" (UID: \"bbcbbde9-55c9-48dc-866d-ab670775e9b3\") " pod="openstack/horizon-755fc898d8-dlnbz" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.818820 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c96cd173-4707-4edc-a92e-35db297082e2-scripts\") pod \"horizon-5d469948dd-n7t4x\" (UID: \"c96cd173-4707-4edc-a92e-35db297082e2\") " pod="openstack/horizon-5d469948dd-n7t4x" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.818843 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw48z\" (UniqueName: \"kubernetes.io/projected/c96cd173-4707-4edc-a92e-35db297082e2-kube-api-access-rw48z\") pod \"horizon-5d469948dd-n7t4x\" (UID: \"c96cd173-4707-4edc-a92e-35db297082e2\") " pod="openstack/horizon-5d469948dd-n7t4x" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.818869 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbcbbde9-55c9-48dc-866d-ab670775e9b3-logs\") pod \"horizon-755fc898d8-dlnbz\" (UID: \"bbcbbde9-55c9-48dc-866d-ab670775e9b3\") " pod="openstack/horizon-755fc898d8-dlnbz" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.818915 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c96cd173-4707-4edc-a92e-35db297082e2-logs\") pod 
\"horizon-5d469948dd-n7t4x\" (UID: \"c96cd173-4707-4edc-a92e-35db297082e2\") " pod="openstack/horizon-5d469948dd-n7t4x" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.818944 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbwsw\" (UniqueName: \"kubernetes.io/projected/bbcbbde9-55c9-48dc-866d-ab670775e9b3-kube-api-access-zbwsw\") pod \"horizon-755fc898d8-dlnbz\" (UID: \"bbcbbde9-55c9-48dc-866d-ab670775e9b3\") " pod="openstack/horizon-755fc898d8-dlnbz" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.821737 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c96cd173-4707-4edc-a92e-35db297082e2-config-data\") pod \"horizon-5d469948dd-n7t4x\" (UID: \"c96cd173-4707-4edc-a92e-35db297082e2\") " pod="openstack/horizon-5d469948dd-n7t4x" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.822396 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c96cd173-4707-4edc-a92e-35db297082e2-scripts\") pod \"horizon-5d469948dd-n7t4x\" (UID: \"c96cd173-4707-4edc-a92e-35db297082e2\") " pod="openstack/horizon-5d469948dd-n7t4x" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.824841 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c96cd173-4707-4edc-a92e-35db297082e2-logs\") pod \"horizon-5d469948dd-n7t4x\" (UID: \"c96cd173-4707-4edc-a92e-35db297082e2\") " pod="openstack/horizon-5d469948dd-n7t4x" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.826673 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c96cd173-4707-4edc-a92e-35db297082e2-horizon-secret-key\") pod \"horizon-5d469948dd-n7t4x\" (UID: \"c96cd173-4707-4edc-a92e-35db297082e2\") " pod="openstack/horizon-5d469948dd-n7t4x" Dec 05 
23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.829207 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c96cd173-4707-4edc-a92e-35db297082e2-combined-ca-bundle\") pod \"horizon-5d469948dd-n7t4x\" (UID: \"c96cd173-4707-4edc-a92e-35db297082e2\") " pod="openstack/horizon-5d469948dd-n7t4x" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.841549 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/c96cd173-4707-4edc-a92e-35db297082e2-horizon-tls-certs\") pod \"horizon-5d469948dd-n7t4x\" (UID: \"c96cd173-4707-4edc-a92e-35db297082e2\") " pod="openstack/horizon-5d469948dd-n7t4x" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.845609 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw48z\" (UniqueName: \"kubernetes.io/projected/c96cd173-4707-4edc-a92e-35db297082e2-kube-api-access-rw48z\") pod \"horizon-5d469948dd-n7t4x\" (UID: \"c96cd173-4707-4edc-a92e-35db297082e2\") " pod="openstack/horizon-5d469948dd-n7t4x" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.897815 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5d469948dd-n7t4x" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.920863 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbcbbde9-55c9-48dc-866d-ab670775e9b3-logs\") pod \"horizon-755fc898d8-dlnbz\" (UID: \"bbcbbde9-55c9-48dc-866d-ab670775e9b3\") " pod="openstack/horizon-755fc898d8-dlnbz" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.920950 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbwsw\" (UniqueName: \"kubernetes.io/projected/bbcbbde9-55c9-48dc-866d-ab670775e9b3-kube-api-access-zbwsw\") pod \"horizon-755fc898d8-dlnbz\" (UID: \"bbcbbde9-55c9-48dc-866d-ab670775e9b3\") " pod="openstack/horizon-755fc898d8-dlnbz" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.921027 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbcbbde9-55c9-48dc-866d-ab670775e9b3-combined-ca-bundle\") pod \"horizon-755fc898d8-dlnbz\" (UID: \"bbcbbde9-55c9-48dc-866d-ab670775e9b3\") " pod="openstack/horizon-755fc898d8-dlnbz" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.921062 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bbcbbde9-55c9-48dc-866d-ab670775e9b3-config-data\") pod \"horizon-755fc898d8-dlnbz\" (UID: \"bbcbbde9-55c9-48dc-866d-ab670775e9b3\") " pod="openstack/horizon-755fc898d8-dlnbz" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.921091 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbcbbde9-55c9-48dc-866d-ab670775e9b3-horizon-tls-certs\") pod \"horizon-755fc898d8-dlnbz\" (UID: \"bbcbbde9-55c9-48dc-866d-ab670775e9b3\") " pod="openstack/horizon-755fc898d8-dlnbz" Dec 05 23:38:11 crc 
kubenswrapper[4734]: I1205 23:38:11.921119 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bbcbbde9-55c9-48dc-866d-ab670775e9b3-scripts\") pod \"horizon-755fc898d8-dlnbz\" (UID: \"bbcbbde9-55c9-48dc-866d-ab670775e9b3\") " pod="openstack/horizon-755fc898d8-dlnbz" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.921147 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bbcbbde9-55c9-48dc-866d-ab670775e9b3-horizon-secret-key\") pod \"horizon-755fc898d8-dlnbz\" (UID: \"bbcbbde9-55c9-48dc-866d-ab670775e9b3\") " pod="openstack/horizon-755fc898d8-dlnbz" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.922209 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbcbbde9-55c9-48dc-866d-ab670775e9b3-logs\") pod \"horizon-755fc898d8-dlnbz\" (UID: \"bbcbbde9-55c9-48dc-866d-ab670775e9b3\") " pod="openstack/horizon-755fc898d8-dlnbz" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.923239 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bbcbbde9-55c9-48dc-866d-ab670775e9b3-config-data\") pod \"horizon-755fc898d8-dlnbz\" (UID: \"bbcbbde9-55c9-48dc-866d-ab670775e9b3\") " pod="openstack/horizon-755fc898d8-dlnbz" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.923739 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bbcbbde9-55c9-48dc-866d-ab670775e9b3-scripts\") pod \"horizon-755fc898d8-dlnbz\" (UID: \"bbcbbde9-55c9-48dc-866d-ab670775e9b3\") " pod="openstack/horizon-755fc898d8-dlnbz" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.927588 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/bbcbbde9-55c9-48dc-866d-ab670775e9b3-horizon-secret-key\") pod \"horizon-755fc898d8-dlnbz\" (UID: \"bbcbbde9-55c9-48dc-866d-ab670775e9b3\") " pod="openstack/horizon-755fc898d8-dlnbz" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.928289 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbcbbde9-55c9-48dc-866d-ab670775e9b3-combined-ca-bundle\") pod \"horizon-755fc898d8-dlnbz\" (UID: \"bbcbbde9-55c9-48dc-866d-ab670775e9b3\") " pod="openstack/horizon-755fc898d8-dlnbz" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.929602 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbcbbde9-55c9-48dc-866d-ab670775e9b3-horizon-tls-certs\") pod \"horizon-755fc898d8-dlnbz\" (UID: \"bbcbbde9-55c9-48dc-866d-ab670775e9b3\") " pod="openstack/horizon-755fc898d8-dlnbz" Dec 05 23:38:11 crc kubenswrapper[4734]: I1205 23:38:11.943806 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbwsw\" (UniqueName: \"kubernetes.io/projected/bbcbbde9-55c9-48dc-866d-ab670775e9b3-kube-api-access-zbwsw\") pod \"horizon-755fc898d8-dlnbz\" (UID: \"bbcbbde9-55c9-48dc-866d-ab670775e9b3\") " pod="openstack/horizon-755fc898d8-dlnbz" Dec 05 23:38:12 crc kubenswrapper[4734]: I1205 23:38:12.005693 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-755fc898d8-dlnbz" Dec 05 23:38:12 crc kubenswrapper[4734]: I1205 23:38:12.061895 4734 generic.go:334] "Generic (PLEG): container finished" podID="741e9328-bc42-4fae-b3dd-316f3286fa42" containerID="d573c95c457ec42a3ba6fba3952780a656918af85cc999b56e5043e47c97c283" exitCode=0 Dec 05 23:38:12 crc kubenswrapper[4734]: I1205 23:38:12.061967 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wzn74" event={"ID":"741e9328-bc42-4fae-b3dd-316f3286fa42","Type":"ContainerDied","Data":"d573c95c457ec42a3ba6fba3952780a656918af85cc999b56e5043e47c97c283"} Dec 05 23:38:12 crc kubenswrapper[4734]: I1205 23:38:12.066921 4734 generic.go:334] "Generic (PLEG): container finished" podID="d9db3688-e0c9-423a-9c15-406e359fec75" containerID="c84a2d92f1ff7d0be99abd71752073d96b69b8ea001fe342c500401c23b2e406" exitCode=0 Dec 05 23:38:12 crc kubenswrapper[4734]: I1205 23:38:12.066961 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-g8vv2" event={"ID":"d9db3688-e0c9-423a-9c15-406e359fec75","Type":"ContainerDied","Data":"c84a2d92f1ff7d0be99abd71752073d96b69b8ea001fe342c500401c23b2e406"} Dec 05 23:38:13 crc kubenswrapper[4734]: I1205 23:38:13.349797 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58dd9ff6bc-x9hwd" Dec 05 23:38:13 crc kubenswrapper[4734]: I1205 23:38:13.421024 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-l7vfg"] Dec 05 23:38:13 crc kubenswrapper[4734]: I1205 23:38:13.421519 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-l7vfg" podUID="e0c9ffcb-625f-49f8-869a-e71e5f53b92b" containerName="dnsmasq-dns" containerID="cri-o://d3435168d1c4d378de9e64845e024fdbc2764b4dbac12d6383881b6a4ac18b30" gracePeriod=10 Dec 05 23:38:14 crc kubenswrapper[4734]: I1205 23:38:14.094779 4734 generic.go:334] "Generic (PLEG): 
container finished" podID="e0c9ffcb-625f-49f8-869a-e71e5f53b92b" containerID="d3435168d1c4d378de9e64845e024fdbc2764b4dbac12d6383881b6a4ac18b30" exitCode=0 Dec 05 23:38:14 crc kubenswrapper[4734]: I1205 23:38:14.094850 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-l7vfg" event={"ID":"e0c9ffcb-625f-49f8-869a-e71e5f53b92b","Type":"ContainerDied","Data":"d3435168d1c4d378de9e64845e024fdbc2764b4dbac12d6383881b6a4ac18b30"} Dec 05 23:38:16 crc kubenswrapper[4734]: I1205 23:38:16.354246 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-l7vfg" podUID="e0c9ffcb-625f-49f8-869a-e71e5f53b92b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.116:5353: connect: connection refused" Dec 05 23:38:18 crc kubenswrapper[4734]: I1205 23:38:18.441923 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-g8vv2" Dec 05 23:38:18 crc kubenswrapper[4734]: I1205 23:38:18.508494 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d9db3688-e0c9-423a-9c15-406e359fec75-fernet-keys\") pod \"d9db3688-e0c9-423a-9c15-406e359fec75\" (UID: \"d9db3688-e0c9-423a-9c15-406e359fec75\") " Dec 05 23:38:18 crc kubenswrapper[4734]: I1205 23:38:18.508643 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9db3688-e0c9-423a-9c15-406e359fec75-combined-ca-bundle\") pod \"d9db3688-e0c9-423a-9c15-406e359fec75\" (UID: \"d9db3688-e0c9-423a-9c15-406e359fec75\") " Dec 05 23:38:18 crc kubenswrapper[4734]: I1205 23:38:18.508763 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9db3688-e0c9-423a-9c15-406e359fec75-scripts\") pod \"d9db3688-e0c9-423a-9c15-406e359fec75\" (UID: 
\"d9db3688-e0c9-423a-9c15-406e359fec75\") " Dec 05 23:38:18 crc kubenswrapper[4734]: I1205 23:38:18.508802 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9db3688-e0c9-423a-9c15-406e359fec75-config-data\") pod \"d9db3688-e0c9-423a-9c15-406e359fec75\" (UID: \"d9db3688-e0c9-423a-9c15-406e359fec75\") " Dec 05 23:38:18 crc kubenswrapper[4734]: I1205 23:38:18.508839 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f58q8\" (UniqueName: \"kubernetes.io/projected/d9db3688-e0c9-423a-9c15-406e359fec75-kube-api-access-f58q8\") pod \"d9db3688-e0c9-423a-9c15-406e359fec75\" (UID: \"d9db3688-e0c9-423a-9c15-406e359fec75\") " Dec 05 23:38:18 crc kubenswrapper[4734]: I1205 23:38:18.508984 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d9db3688-e0c9-423a-9c15-406e359fec75-credential-keys\") pod \"d9db3688-e0c9-423a-9c15-406e359fec75\" (UID: \"d9db3688-e0c9-423a-9c15-406e359fec75\") " Dec 05 23:38:18 crc kubenswrapper[4734]: I1205 23:38:18.516837 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9db3688-e0c9-423a-9c15-406e359fec75-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d9db3688-e0c9-423a-9c15-406e359fec75" (UID: "d9db3688-e0c9-423a-9c15-406e359fec75"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:38:18 crc kubenswrapper[4734]: I1205 23:38:18.517373 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9db3688-e0c9-423a-9c15-406e359fec75-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d9db3688-e0c9-423a-9c15-406e359fec75" (UID: "d9db3688-e0c9-423a-9c15-406e359fec75"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:38:18 crc kubenswrapper[4734]: I1205 23:38:18.519682 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9db3688-e0c9-423a-9c15-406e359fec75-kube-api-access-f58q8" (OuterVolumeSpecName: "kube-api-access-f58q8") pod "d9db3688-e0c9-423a-9c15-406e359fec75" (UID: "d9db3688-e0c9-423a-9c15-406e359fec75"). InnerVolumeSpecName "kube-api-access-f58q8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:38:18 crc kubenswrapper[4734]: I1205 23:38:18.528836 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9db3688-e0c9-423a-9c15-406e359fec75-scripts" (OuterVolumeSpecName: "scripts") pod "d9db3688-e0c9-423a-9c15-406e359fec75" (UID: "d9db3688-e0c9-423a-9c15-406e359fec75"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:38:18 crc kubenswrapper[4734]: I1205 23:38:18.548286 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9db3688-e0c9-423a-9c15-406e359fec75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9db3688-e0c9-423a-9c15-406e359fec75" (UID: "d9db3688-e0c9-423a-9c15-406e359fec75"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:38:18 crc kubenswrapper[4734]: I1205 23:38:18.554592 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9db3688-e0c9-423a-9c15-406e359fec75-config-data" (OuterVolumeSpecName: "config-data") pod "d9db3688-e0c9-423a-9c15-406e359fec75" (UID: "d9db3688-e0c9-423a-9c15-406e359fec75"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:38:18 crc kubenswrapper[4734]: I1205 23:38:18.611647 4734 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d9db3688-e0c9-423a-9c15-406e359fec75-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:18 crc kubenswrapper[4734]: I1205 23:38:18.611711 4734 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d9db3688-e0c9-423a-9c15-406e359fec75-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:18 crc kubenswrapper[4734]: I1205 23:38:18.611722 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9db3688-e0c9-423a-9c15-406e359fec75-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:18 crc kubenswrapper[4734]: I1205 23:38:18.611733 4734 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9db3688-e0c9-423a-9c15-406e359fec75-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:18 crc kubenswrapper[4734]: I1205 23:38:18.611742 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9db3688-e0c9-423a-9c15-406e359fec75-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:18 crc kubenswrapper[4734]: I1205 23:38:18.611751 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f58q8\" (UniqueName: \"kubernetes.io/projected/d9db3688-e0c9-423a-9c15-406e359fec75-kube-api-access-f58q8\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 23:38:19.155059 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-g8vv2" event={"ID":"d9db3688-e0c9-423a-9c15-406e359fec75","Type":"ContainerDied","Data":"8369450afa2aad2b278ba8545ab04161ba5f7f73d5728856ae51e537df937399"} Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 
23:38:19.155126 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8369450afa2aad2b278ba8545ab04161ba5f7f73d5728856ae51e537df937399" Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 23:38:19.155137 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-g8vv2" Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 23:38:19.645665 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-g8vv2"] Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 23:38:19.658012 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-g8vv2"] Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 23:38:19.756250 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-zhc67"] Dec 05 23:38:19 crc kubenswrapper[4734]: E1205 23:38:19.756923 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9db3688-e0c9-423a-9c15-406e359fec75" containerName="keystone-bootstrap" Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 23:38:19.756944 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9db3688-e0c9-423a-9c15-406e359fec75" containerName="keystone-bootstrap" Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 23:38:19.757133 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9db3688-e0c9-423a-9c15-406e359fec75" containerName="keystone-bootstrap" Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 23:38:19.757966 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-zhc67" Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 23:38:19.768385 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-zhc67"] Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 23:38:19.788632 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qf7c2" Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 23:38:19.788879 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 23:38:19.789076 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 23:38:19.789206 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 23:38:19.789339 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 23:38:19.844251 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df84fed8-d899-47ed-a702-2fbae2f75d53-combined-ca-bundle\") pod \"keystone-bootstrap-zhc67\" (UID: \"df84fed8-d899-47ed-a702-2fbae2f75d53\") " pod="openstack/keystone-bootstrap-zhc67" Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 23:38:19.844366 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df84fed8-d899-47ed-a702-2fbae2f75d53-scripts\") pod \"keystone-bootstrap-zhc67\" (UID: \"df84fed8-d899-47ed-a702-2fbae2f75d53\") " pod="openstack/keystone-bootstrap-zhc67" Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 23:38:19.844420 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/df84fed8-d899-47ed-a702-2fbae2f75d53-config-data\") pod \"keystone-bootstrap-zhc67\" (UID: \"df84fed8-d899-47ed-a702-2fbae2f75d53\") " pod="openstack/keystone-bootstrap-zhc67" Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 23:38:19.844449 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/df84fed8-d899-47ed-a702-2fbae2f75d53-credential-keys\") pod \"keystone-bootstrap-zhc67\" (UID: \"df84fed8-d899-47ed-a702-2fbae2f75d53\") " pod="openstack/keystone-bootstrap-zhc67" Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 23:38:19.844587 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/df84fed8-d899-47ed-a702-2fbae2f75d53-fernet-keys\") pod \"keystone-bootstrap-zhc67\" (UID: \"df84fed8-d899-47ed-a702-2fbae2f75d53\") " pod="openstack/keystone-bootstrap-zhc67" Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 23:38:19.844657 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6sht\" (UniqueName: \"kubernetes.io/projected/df84fed8-d899-47ed-a702-2fbae2f75d53-kube-api-access-m6sht\") pod \"keystone-bootstrap-zhc67\" (UID: \"df84fed8-d899-47ed-a702-2fbae2f75d53\") " pod="openstack/keystone-bootstrap-zhc67" Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 23:38:19.946988 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df84fed8-d899-47ed-a702-2fbae2f75d53-scripts\") pod \"keystone-bootstrap-zhc67\" (UID: \"df84fed8-d899-47ed-a702-2fbae2f75d53\") " pod="openstack/keystone-bootstrap-zhc67" Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 23:38:19.947082 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/df84fed8-d899-47ed-a702-2fbae2f75d53-config-data\") pod \"keystone-bootstrap-zhc67\" (UID: \"df84fed8-d899-47ed-a702-2fbae2f75d53\") " pod="openstack/keystone-bootstrap-zhc67" Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 23:38:19.947108 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/df84fed8-d899-47ed-a702-2fbae2f75d53-credential-keys\") pod \"keystone-bootstrap-zhc67\" (UID: \"df84fed8-d899-47ed-a702-2fbae2f75d53\") " pod="openstack/keystone-bootstrap-zhc67" Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 23:38:19.947151 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/df84fed8-d899-47ed-a702-2fbae2f75d53-fernet-keys\") pod \"keystone-bootstrap-zhc67\" (UID: \"df84fed8-d899-47ed-a702-2fbae2f75d53\") " pod="openstack/keystone-bootstrap-zhc67" Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 23:38:19.947219 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6sht\" (UniqueName: \"kubernetes.io/projected/df84fed8-d899-47ed-a702-2fbae2f75d53-kube-api-access-m6sht\") pod \"keystone-bootstrap-zhc67\" (UID: \"df84fed8-d899-47ed-a702-2fbae2f75d53\") " pod="openstack/keystone-bootstrap-zhc67" Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 23:38:19.947349 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df84fed8-d899-47ed-a702-2fbae2f75d53-combined-ca-bundle\") pod \"keystone-bootstrap-zhc67\" (UID: \"df84fed8-d899-47ed-a702-2fbae2f75d53\") " pod="openstack/keystone-bootstrap-zhc67" Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 23:38:19.952577 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df84fed8-d899-47ed-a702-2fbae2f75d53-combined-ca-bundle\") pod 
\"keystone-bootstrap-zhc67\" (UID: \"df84fed8-d899-47ed-a702-2fbae2f75d53\") " pod="openstack/keystone-bootstrap-zhc67" Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 23:38:19.952944 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/df84fed8-d899-47ed-a702-2fbae2f75d53-credential-keys\") pod \"keystone-bootstrap-zhc67\" (UID: \"df84fed8-d899-47ed-a702-2fbae2f75d53\") " pod="openstack/keystone-bootstrap-zhc67" Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 23:38:19.953677 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df84fed8-d899-47ed-a702-2fbae2f75d53-scripts\") pod \"keystone-bootstrap-zhc67\" (UID: \"df84fed8-d899-47ed-a702-2fbae2f75d53\") " pod="openstack/keystone-bootstrap-zhc67" Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 23:38:19.964809 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/df84fed8-d899-47ed-a702-2fbae2f75d53-fernet-keys\") pod \"keystone-bootstrap-zhc67\" (UID: \"df84fed8-d899-47ed-a702-2fbae2f75d53\") " pod="openstack/keystone-bootstrap-zhc67" Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 23:38:19.967479 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df84fed8-d899-47ed-a702-2fbae2f75d53-config-data\") pod \"keystone-bootstrap-zhc67\" (UID: \"df84fed8-d899-47ed-a702-2fbae2f75d53\") " pod="openstack/keystone-bootstrap-zhc67" Dec 05 23:38:19 crc kubenswrapper[4734]: I1205 23:38:19.970215 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6sht\" (UniqueName: \"kubernetes.io/projected/df84fed8-d899-47ed-a702-2fbae2f75d53-kube-api-access-m6sht\") pod \"keystone-bootstrap-zhc67\" (UID: \"df84fed8-d899-47ed-a702-2fbae2f75d53\") " pod="openstack/keystone-bootstrap-zhc67" Dec 05 23:38:20 crc 
kubenswrapper[4734]: I1205 23:38:20.106334 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-zhc67" Dec 05 23:38:20 crc kubenswrapper[4734]: E1205 23:38:20.594164 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 05 23:38:20 crc kubenswrapper[4734]: E1205 23:38:20.594480 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n599h5b5h65h4hbbh55dh7dh59chd8h5f8h56h9fh7bh5b4h74h595h68bh57bh55bh675h568h56dhd8h5d8h5f5h646h594h5d5h546hc5hd9hfdq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zpdkc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabiliti
es{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7ddfc99ccc-pn7dw_openstack(904ca7a9-2b5c-4636-bfeb-f98ea4981e2f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 23:38:20 crc kubenswrapper[4734]: E1205 23:38:20.599439 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-7ddfc99ccc-pn7dw" podUID="904ca7a9-2b5c-4636-bfeb-f98ea4981e2f" Dec 05 23:38:20 crc kubenswrapper[4734]: E1205 23:38:20.897780 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Dec 05 23:38:20 crc kubenswrapper[4734]: E1205 23:38:20.898070 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n88h5fhbch679h57h667hb6h56bh567h5bch557h64fh59ch54dh5bdh58h557h548h545h5bch585h68dh547h9h5dbh5cdhf7h7bh54h7h686h66cq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qs8cz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(067f11aa-41d5-4a34-9f2e-33b35981e9ba): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 23:38:20 crc kubenswrapper[4734]: E1205 23:38:20.916160 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 05 23:38:20 crc kubenswrapper[4734]: E1205 23:38:20.916871 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5ch67dh55h64bhfh646hfh679h698h94h557h697h5cfh5c5h68dh9dhd5h8hbbh6dh67dhd5h667h9h5bbhc7h7ch7fhcfh58dh598h5d7q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4f9nc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-769b7bcdf7-h7m8l_openstack(b9dc8ed3-eab2-486b-9001-8ced7ac9ac22): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 23:38:20 crc kubenswrapper[4734]: E1205 23:38:20.919083 
4734 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-769b7bcdf7-h7m8l" podUID="b9dc8ed3-eab2-486b-9001-8ced7ac9ac22" Dec 05 23:38:21 crc kubenswrapper[4734]: I1205 23:38:21.354613 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-l7vfg" podUID="e0c9ffcb-625f-49f8-869a-e71e5f53b92b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.116:5353: connect: connection refused" Dec 05 23:38:21 crc kubenswrapper[4734]: I1205 23:38:21.626417 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9db3688-e0c9-423a-9c15-406e359fec75" path="/var/lib/kubelet/pods/d9db3688-e0c9-423a-9c15-406e359fec75/volumes" Dec 05 23:38:22 crc kubenswrapper[4734]: E1205 23:38:22.596717 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Dec 05 23:38:22 crc kubenswrapper[4734]: E1205 23:38:22.597407 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xqtfj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-6x54r_openstack(e97c0b2c-1294-43eb-a424-5c04e198611e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 23:38:22 crc kubenswrapper[4734]: E1205 23:38:22.598614 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-6x54r" podUID="e97c0b2c-1294-43eb-a424-5c04e198611e" Dec 05 23:38:22 crc kubenswrapper[4734]: I1205 23:38:22.684485 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wzn74" Dec 05 23:38:22 crc kubenswrapper[4734]: I1205 23:38:22.817708 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/741e9328-bc42-4fae-b3dd-316f3286fa42-config-data\") pod \"741e9328-bc42-4fae-b3dd-316f3286fa42\" (UID: \"741e9328-bc42-4fae-b3dd-316f3286fa42\") " Dec 05 23:38:22 crc kubenswrapper[4734]: I1205 23:38:22.818043 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtncl\" (UniqueName: \"kubernetes.io/projected/741e9328-bc42-4fae-b3dd-316f3286fa42-kube-api-access-wtncl\") pod \"741e9328-bc42-4fae-b3dd-316f3286fa42\" (UID: \"741e9328-bc42-4fae-b3dd-316f3286fa42\") " Dec 05 23:38:22 crc kubenswrapper[4734]: I1205 23:38:22.818108 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/741e9328-bc42-4fae-b3dd-316f3286fa42-db-sync-config-data\") pod \"741e9328-bc42-4fae-b3dd-316f3286fa42\" (UID: \"741e9328-bc42-4fae-b3dd-316f3286fa42\") " Dec 05 23:38:22 crc kubenswrapper[4734]: I1205 23:38:22.818374 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/741e9328-bc42-4fae-b3dd-316f3286fa42-combined-ca-bundle\") pod \"741e9328-bc42-4fae-b3dd-316f3286fa42\" (UID: \"741e9328-bc42-4fae-b3dd-316f3286fa42\") " Dec 05 23:38:22 crc kubenswrapper[4734]: I1205 23:38:22.827322 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/741e9328-bc42-4fae-b3dd-316f3286fa42-kube-api-access-wtncl" (OuterVolumeSpecName: "kube-api-access-wtncl") pod "741e9328-bc42-4fae-b3dd-316f3286fa42" (UID: "741e9328-bc42-4fae-b3dd-316f3286fa42"). InnerVolumeSpecName "kube-api-access-wtncl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:38:22 crc kubenswrapper[4734]: I1205 23:38:22.827955 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/741e9328-bc42-4fae-b3dd-316f3286fa42-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "741e9328-bc42-4fae-b3dd-316f3286fa42" (UID: "741e9328-bc42-4fae-b3dd-316f3286fa42"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:38:22 crc kubenswrapper[4734]: I1205 23:38:22.883324 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/741e9328-bc42-4fae-b3dd-316f3286fa42-config-data" (OuterVolumeSpecName: "config-data") pod "741e9328-bc42-4fae-b3dd-316f3286fa42" (UID: "741e9328-bc42-4fae-b3dd-316f3286fa42"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:38:22 crc kubenswrapper[4734]: I1205 23:38:22.905158 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/741e9328-bc42-4fae-b3dd-316f3286fa42-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "741e9328-bc42-4fae-b3dd-316f3286fa42" (UID: "741e9328-bc42-4fae-b3dd-316f3286fa42"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:38:22 crc kubenswrapper[4734]: I1205 23:38:22.921190 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/741e9328-bc42-4fae-b3dd-316f3286fa42-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:22 crc kubenswrapper[4734]: I1205 23:38:22.921231 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/741e9328-bc42-4fae-b3dd-316f3286fa42-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:22 crc kubenswrapper[4734]: I1205 23:38:22.921245 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtncl\" (UniqueName: \"kubernetes.io/projected/741e9328-bc42-4fae-b3dd-316f3286fa42-kube-api-access-wtncl\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:22 crc kubenswrapper[4734]: I1205 23:38:22.921312 4734 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/741e9328-bc42-4fae-b3dd-316f3286fa42-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:23 crc kubenswrapper[4734]: I1205 23:38:23.201865 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-wzn74" Dec 05 23:38:23 crc kubenswrapper[4734]: I1205 23:38:23.208024 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wzn74" event={"ID":"741e9328-bc42-4fae-b3dd-316f3286fa42","Type":"ContainerDied","Data":"84e71b77f1960bcfad7eadc60bc13169af152e4b033dc448bdc6009bfeb04856"} Dec 05 23:38:23 crc kubenswrapper[4734]: I1205 23:38:23.208102 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84e71b77f1960bcfad7eadc60bc13169af152e4b033dc448bdc6009bfeb04856" Dec 05 23:38:23 crc kubenswrapper[4734]: E1205 23:38:23.209702 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-6x54r" podUID="e97c0b2c-1294-43eb-a424-5c04e198611e" Dec 05 23:38:24 crc kubenswrapper[4734]: I1205 23:38:24.193942 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-s4pbw"] Dec 05 23:38:24 crc kubenswrapper[4734]: E1205 23:38:24.197146 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="741e9328-bc42-4fae-b3dd-316f3286fa42" containerName="glance-db-sync" Dec 05 23:38:24 crc kubenswrapper[4734]: I1205 23:38:24.197232 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="741e9328-bc42-4fae-b3dd-316f3286fa42" containerName="glance-db-sync" Dec 05 23:38:24 crc kubenswrapper[4734]: I1205 23:38:24.197567 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="741e9328-bc42-4fae-b3dd-316f3286fa42" containerName="glance-db-sync" Dec 05 23:38:24 crc kubenswrapper[4734]: I1205 23:38:24.201610 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-s4pbw" Dec 05 23:38:24 crc kubenswrapper[4734]: I1205 23:38:24.206890 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-s4pbw"] Dec 05 23:38:24 crc kubenswrapper[4734]: I1205 23:38:24.255759 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx688\" (UniqueName: \"kubernetes.io/projected/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-kube-api-access-cx688\") pod \"dnsmasq-dns-785d8bcb8c-s4pbw\" (UID: \"c0867534-4785-4d2c-9be9-74f3dfa9fb3c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-s4pbw" Dec 05 23:38:24 crc kubenswrapper[4734]: I1205 23:38:24.255845 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-s4pbw\" (UID: \"c0867534-4785-4d2c-9be9-74f3dfa9fb3c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-s4pbw" Dec 05 23:38:24 crc kubenswrapper[4734]: I1205 23:38:24.255942 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-config\") pod \"dnsmasq-dns-785d8bcb8c-s4pbw\" (UID: \"c0867534-4785-4d2c-9be9-74f3dfa9fb3c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-s4pbw" Dec 05 23:38:24 crc kubenswrapper[4734]: I1205 23:38:24.255996 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-s4pbw\" (UID: \"c0867534-4785-4d2c-9be9-74f3dfa9fb3c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-s4pbw" Dec 05 23:38:24 crc kubenswrapper[4734]: I1205 23:38:24.256041 4734 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-s4pbw\" (UID: \"c0867534-4785-4d2c-9be9-74f3dfa9fb3c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-s4pbw" Dec 05 23:38:24 crc kubenswrapper[4734]: I1205 23:38:24.256088 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-s4pbw\" (UID: \"c0867534-4785-4d2c-9be9-74f3dfa9fb3c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-s4pbw" Dec 05 23:38:24 crc kubenswrapper[4734]: I1205 23:38:24.358073 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-config\") pod \"dnsmasq-dns-785d8bcb8c-s4pbw\" (UID: \"c0867534-4785-4d2c-9be9-74f3dfa9fb3c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-s4pbw" Dec 05 23:38:24 crc kubenswrapper[4734]: I1205 23:38:24.358154 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-s4pbw\" (UID: \"c0867534-4785-4d2c-9be9-74f3dfa9fb3c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-s4pbw" Dec 05 23:38:24 crc kubenswrapper[4734]: I1205 23:38:24.358191 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-s4pbw\" (UID: \"c0867534-4785-4d2c-9be9-74f3dfa9fb3c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-s4pbw" Dec 05 23:38:24 crc kubenswrapper[4734]: I1205 23:38:24.358216 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-s4pbw\" (UID: \"c0867534-4785-4d2c-9be9-74f3dfa9fb3c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-s4pbw" Dec 05 23:38:24 crc kubenswrapper[4734]: I1205 23:38:24.358286 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx688\" (UniqueName: \"kubernetes.io/projected/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-kube-api-access-cx688\") pod \"dnsmasq-dns-785d8bcb8c-s4pbw\" (UID: \"c0867534-4785-4d2c-9be9-74f3dfa9fb3c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-s4pbw" Dec 05 23:38:24 crc kubenswrapper[4734]: I1205 23:38:24.358311 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-s4pbw\" (UID: \"c0867534-4785-4d2c-9be9-74f3dfa9fb3c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-s4pbw" Dec 05 23:38:24 crc kubenswrapper[4734]: I1205 23:38:24.359461 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-config\") pod \"dnsmasq-dns-785d8bcb8c-s4pbw\" (UID: \"c0867534-4785-4d2c-9be9-74f3dfa9fb3c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-s4pbw" Dec 05 23:38:24 crc kubenswrapper[4734]: I1205 23:38:24.360119 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-s4pbw\" (UID: \"c0867534-4785-4d2c-9be9-74f3dfa9fb3c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-s4pbw" Dec 05 23:38:24 crc kubenswrapper[4734]: I1205 23:38:24.362234 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-s4pbw\" (UID: \"c0867534-4785-4d2c-9be9-74f3dfa9fb3c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-s4pbw" Dec 05 23:38:24 crc kubenswrapper[4734]: I1205 23:38:24.362314 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-s4pbw\" (UID: \"c0867534-4785-4d2c-9be9-74f3dfa9fb3c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-s4pbw" Dec 05 23:38:24 crc kubenswrapper[4734]: I1205 23:38:24.362643 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-s4pbw\" (UID: \"c0867534-4785-4d2c-9be9-74f3dfa9fb3c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-s4pbw" Dec 05 23:38:24 crc kubenswrapper[4734]: I1205 23:38:24.379704 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx688\" (UniqueName: \"kubernetes.io/projected/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-kube-api-access-cx688\") pod \"dnsmasq-dns-785d8bcb8c-s4pbw\" (UID: \"c0867534-4785-4d2c-9be9-74f3dfa9fb3c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-s4pbw" Dec 05 23:38:24 crc kubenswrapper[4734]: I1205 23:38:24.529018 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-s4pbw" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.060893 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.063118 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.066138 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.066511 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-m6gpz" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.069779 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.073281 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.175140 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-scripts\") pod \"glance-default-external-api-0\" (UID: \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.175200 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.175307 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-config-data\") pod \"glance-default-external-api-0\" (UID: \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: 
I1205 23:38:25.175350 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.175385 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.175402 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-logs\") pod \"glance-default-external-api-0\" (UID: \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.175426 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhfr4\" (UniqueName: \"kubernetes.io/projected/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-kube-api-access-bhfr4\") pod \"glance-default-external-api-0\" (UID: \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.277159 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-scripts\") pod \"glance-default-external-api-0\" (UID: \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.277205 
4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.277304 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-config-data\") pod \"glance-default-external-api-0\" (UID: \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.278196 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.278240 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.278265 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-logs\") pod \"glance-default-external-api-0\" (UID: \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.278287 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-bhfr4\" (UniqueName: \"kubernetes.io/projected/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-kube-api-access-bhfr4\") pod \"glance-default-external-api-0\" (UID: \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.279186 4734 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.281712 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.282959 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-scripts\") pod \"glance-default-external-api-0\" (UID: \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.283615 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.285171 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-config-data\") pod \"glance-default-external-api-0\" (UID: \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.291877 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-logs\") pod \"glance-default-external-api-0\" (UID: \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.299801 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhfr4\" (UniqueName: \"kubernetes.io/projected/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-kube-api-access-bhfr4\") pod \"glance-default-external-api-0\" (UID: \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.308301 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.401495 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.402568 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.404206 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.407800 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.419838 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.483177 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21a9837d-cc1d-4bf1-9b52-f9196880e367-config-data\") pod \"glance-default-internal-api-0\" (UID: \"21a9837d-cc1d-4bf1-9b52-f9196880e367\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.483296 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/21a9837d-cc1d-4bf1-9b52-f9196880e367-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"21a9837d-cc1d-4bf1-9b52-f9196880e367\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.483345 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56r2c\" (UniqueName: \"kubernetes.io/projected/21a9837d-cc1d-4bf1-9b52-f9196880e367-kube-api-access-56r2c\") pod \"glance-default-internal-api-0\" (UID: \"21a9837d-cc1d-4bf1-9b52-f9196880e367\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.483432 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21a9837d-cc1d-4bf1-9b52-f9196880e367-scripts\") pod \"glance-default-internal-api-0\" (UID: \"21a9837d-cc1d-4bf1-9b52-f9196880e367\") " 
pod="openstack/glance-default-internal-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.483460 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21a9837d-cc1d-4bf1-9b52-f9196880e367-logs\") pod \"glance-default-internal-api-0\" (UID: \"21a9837d-cc1d-4bf1-9b52-f9196880e367\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.483746 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a9837d-cc1d-4bf1-9b52-f9196880e367-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"21a9837d-cc1d-4bf1-9b52-f9196880e367\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.483876 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"21a9837d-cc1d-4bf1-9b52-f9196880e367\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.586825 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21a9837d-cc1d-4bf1-9b52-f9196880e367-scripts\") pod \"glance-default-internal-api-0\" (UID: \"21a9837d-cc1d-4bf1-9b52-f9196880e367\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.587336 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21a9837d-cc1d-4bf1-9b52-f9196880e367-logs\") pod \"glance-default-internal-api-0\" (UID: \"21a9837d-cc1d-4bf1-9b52-f9196880e367\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:25 
crc kubenswrapper[4734]: I1205 23:38:25.587390 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a9837d-cc1d-4bf1-9b52-f9196880e367-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"21a9837d-cc1d-4bf1-9b52-f9196880e367\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.587438 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"21a9837d-cc1d-4bf1-9b52-f9196880e367\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.587485 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21a9837d-cc1d-4bf1-9b52-f9196880e367-config-data\") pod \"glance-default-internal-api-0\" (UID: \"21a9837d-cc1d-4bf1-9b52-f9196880e367\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.587545 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/21a9837d-cc1d-4bf1-9b52-f9196880e367-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"21a9837d-cc1d-4bf1-9b52-f9196880e367\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.587590 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56r2c\" (UniqueName: \"kubernetes.io/projected/21a9837d-cc1d-4bf1-9b52-f9196880e367-kube-api-access-56r2c\") pod \"glance-default-internal-api-0\" (UID: \"21a9837d-cc1d-4bf1-9b52-f9196880e367\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.588187 4734 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"21a9837d-cc1d-4bf1-9b52-f9196880e367\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.588414 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/21a9837d-cc1d-4bf1-9b52-f9196880e367-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"21a9837d-cc1d-4bf1-9b52-f9196880e367\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.588598 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21a9837d-cc1d-4bf1-9b52-f9196880e367-logs\") pod \"glance-default-internal-api-0\" (UID: \"21a9837d-cc1d-4bf1-9b52-f9196880e367\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.593032 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21a9837d-cc1d-4bf1-9b52-f9196880e367-scripts\") pod \"glance-default-internal-api-0\" (UID: \"21a9837d-cc1d-4bf1-9b52-f9196880e367\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.594516 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a9837d-cc1d-4bf1-9b52-f9196880e367-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"21a9837d-cc1d-4bf1-9b52-f9196880e367\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.594694 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/21a9837d-cc1d-4bf1-9b52-f9196880e367-config-data\") pod \"glance-default-internal-api-0\" (UID: \"21a9837d-cc1d-4bf1-9b52-f9196880e367\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.606066 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56r2c\" (UniqueName: \"kubernetes.io/projected/21a9837d-cc1d-4bf1-9b52-f9196880e367-kube-api-access-56r2c\") pod \"glance-default-internal-api-0\" (UID: \"21a9837d-cc1d-4bf1-9b52-f9196880e367\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.618071 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"21a9837d-cc1d-4bf1-9b52-f9196880e367\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:25 crc kubenswrapper[4734]: I1205 23:38:25.789955 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 23:38:27 crc kubenswrapper[4734]: I1205 23:38:27.647499 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 23:38:27 crc kubenswrapper[4734]: I1205 23:38:27.730110 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 23:38:31 crc kubenswrapper[4734]: I1205 23:38:31.355551 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-l7vfg" podUID="e0c9ffcb-625f-49f8-869a-e71e5f53b92b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.116:5353: i/o timeout" Dec 05 23:38:31 crc kubenswrapper[4734]: I1205 23:38:31.357890 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-l7vfg" Dec 05 23:38:33 crc kubenswrapper[4734]: E1205 23:38:33.330360 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 05 23:38:33 crc kubenswrapper[4734]: E1205 23:38:33.330953 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pv9bp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-f82cb_openstack(d71f9558-c417-4cc7-934f-258f388cced2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 23:38:33 crc kubenswrapper[4734]: E1205 23:38:33.332332 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-f82cb" 
podUID="d71f9558-c417-4cc7-934f-258f388cced2" Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.457460 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7ddfc99ccc-pn7dw" Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.469717 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-769b7bcdf7-h7m8l" Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.485997 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-l7vfg" Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.594645 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9dc8ed3-eab2-486b-9001-8ced7ac9ac22-logs\") pod \"b9dc8ed3-eab2-486b-9001-8ced7ac9ac22\" (UID: \"b9dc8ed3-eab2-486b-9001-8ced7ac9ac22\") " Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.594723 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9dc8ed3-eab2-486b-9001-8ced7ac9ac22-scripts\") pod \"b9dc8ed3-eab2-486b-9001-8ced7ac9ac22\" (UID: \"b9dc8ed3-eab2-486b-9001-8ced7ac9ac22\") " Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.594759 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9dc8ed3-eab2-486b-9001-8ced7ac9ac22-config-data\") pod \"b9dc8ed3-eab2-486b-9001-8ced7ac9ac22\" (UID: \"b9dc8ed3-eab2-486b-9001-8ced7ac9ac22\") " Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.594804 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/904ca7a9-2b5c-4636-bfeb-f98ea4981e2f-logs\") pod \"904ca7a9-2b5c-4636-bfeb-f98ea4981e2f\" (UID: \"904ca7a9-2b5c-4636-bfeb-f98ea4981e2f\") " Dec 05 23:38:33 crc 
kubenswrapper[4734]: I1205 23:38:33.594849 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b9dc8ed3-eab2-486b-9001-8ced7ac9ac22-horizon-secret-key\") pod \"b9dc8ed3-eab2-486b-9001-8ced7ac9ac22\" (UID: \"b9dc8ed3-eab2-486b-9001-8ced7ac9ac22\") " Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.594929 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0c9ffcb-625f-49f8-869a-e71e5f53b92b-config\") pod \"e0c9ffcb-625f-49f8-869a-e71e5f53b92b\" (UID: \"e0c9ffcb-625f-49f8-869a-e71e5f53b92b\") " Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.595039 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f9nc\" (UniqueName: \"kubernetes.io/projected/b9dc8ed3-eab2-486b-9001-8ced7ac9ac22-kube-api-access-4f9nc\") pod \"b9dc8ed3-eab2-486b-9001-8ced7ac9ac22\" (UID: \"b9dc8ed3-eab2-486b-9001-8ced7ac9ac22\") " Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.595114 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0c9ffcb-625f-49f8-869a-e71e5f53b92b-ovsdbserver-nb\") pod \"e0c9ffcb-625f-49f8-869a-e71e5f53b92b\" (UID: \"e0c9ffcb-625f-49f8-869a-e71e5f53b92b\") " Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.595168 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/904ca7a9-2b5c-4636-bfeb-f98ea4981e2f-horizon-secret-key\") pod \"904ca7a9-2b5c-4636-bfeb-f98ea4981e2f\" (UID: \"904ca7a9-2b5c-4636-bfeb-f98ea4981e2f\") " Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.595212 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/904ca7a9-2b5c-4636-bfeb-f98ea4981e2f-config-data\") pod \"904ca7a9-2b5c-4636-bfeb-f98ea4981e2f\" (UID: \"904ca7a9-2b5c-4636-bfeb-f98ea4981e2f\") " Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.595277 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0c9ffcb-625f-49f8-869a-e71e5f53b92b-ovsdbserver-sb\") pod \"e0c9ffcb-625f-49f8-869a-e71e5f53b92b\" (UID: \"e0c9ffcb-625f-49f8-869a-e71e5f53b92b\") " Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.595323 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0c9ffcb-625f-49f8-869a-e71e5f53b92b-dns-svc\") pod \"e0c9ffcb-625f-49f8-869a-e71e5f53b92b\" (UID: \"e0c9ffcb-625f-49f8-869a-e71e5f53b92b\") " Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.595351 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/904ca7a9-2b5c-4636-bfeb-f98ea4981e2f-scripts\") pod \"904ca7a9-2b5c-4636-bfeb-f98ea4981e2f\" (UID: \"904ca7a9-2b5c-4636-bfeb-f98ea4981e2f\") " Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.595405 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpdkc\" (UniqueName: \"kubernetes.io/projected/904ca7a9-2b5c-4636-bfeb-f98ea4981e2f-kube-api-access-zpdkc\") pod \"904ca7a9-2b5c-4636-bfeb-f98ea4981e2f\" (UID: \"904ca7a9-2b5c-4636-bfeb-f98ea4981e2f\") " Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.595470 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c79lm\" (UniqueName: \"kubernetes.io/projected/e0c9ffcb-625f-49f8-869a-e71e5f53b92b-kube-api-access-c79lm\") pod \"e0c9ffcb-625f-49f8-869a-e71e5f53b92b\" (UID: \"e0c9ffcb-625f-49f8-869a-e71e5f53b92b\") " Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 
23:38:33.595716 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9dc8ed3-eab2-486b-9001-8ced7ac9ac22-config-data" (OuterVolumeSpecName: "config-data") pod "b9dc8ed3-eab2-486b-9001-8ced7ac9ac22" (UID: "b9dc8ed3-eab2-486b-9001-8ced7ac9ac22"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.596085 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9dc8ed3-eab2-486b-9001-8ced7ac9ac22-logs" (OuterVolumeSpecName: "logs") pod "b9dc8ed3-eab2-486b-9001-8ced7ac9ac22" (UID: "b9dc8ed3-eab2-486b-9001-8ced7ac9ac22"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.596166 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9dc8ed3-eab2-486b-9001-8ced7ac9ac22-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.596263 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/904ca7a9-2b5c-4636-bfeb-f98ea4981e2f-logs" (OuterVolumeSpecName: "logs") pod "904ca7a9-2b5c-4636-bfeb-f98ea4981e2f" (UID: "904ca7a9-2b5c-4636-bfeb-f98ea4981e2f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.596398 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9dc8ed3-eab2-486b-9001-8ced7ac9ac22-scripts" (OuterVolumeSpecName: "scripts") pod "b9dc8ed3-eab2-486b-9001-8ced7ac9ac22" (UID: "b9dc8ed3-eab2-486b-9001-8ced7ac9ac22"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.596484 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/904ca7a9-2b5c-4636-bfeb-f98ea4981e2f-scripts" (OuterVolumeSpecName: "scripts") pod "904ca7a9-2b5c-4636-bfeb-f98ea4981e2f" (UID: "904ca7a9-2b5c-4636-bfeb-f98ea4981e2f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.596837 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/904ca7a9-2b5c-4636-bfeb-f98ea4981e2f-config-data" (OuterVolumeSpecName: "config-data") pod "904ca7a9-2b5c-4636-bfeb-f98ea4981e2f" (UID: "904ca7a9-2b5c-4636-bfeb-f98ea4981e2f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.605770 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9dc8ed3-eab2-486b-9001-8ced7ac9ac22-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b9dc8ed3-eab2-486b-9001-8ced7ac9ac22" (UID: "b9dc8ed3-eab2-486b-9001-8ced7ac9ac22"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.605817 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0c9ffcb-625f-49f8-869a-e71e5f53b92b-kube-api-access-c79lm" (OuterVolumeSpecName: "kube-api-access-c79lm") pod "e0c9ffcb-625f-49f8-869a-e71e5f53b92b" (UID: "e0c9ffcb-625f-49f8-869a-e71e5f53b92b"). InnerVolumeSpecName "kube-api-access-c79lm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.605877 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9dc8ed3-eab2-486b-9001-8ced7ac9ac22-kube-api-access-4f9nc" (OuterVolumeSpecName: "kube-api-access-4f9nc") pod "b9dc8ed3-eab2-486b-9001-8ced7ac9ac22" (UID: "b9dc8ed3-eab2-486b-9001-8ced7ac9ac22"). InnerVolumeSpecName "kube-api-access-4f9nc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.606199 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/904ca7a9-2b5c-4636-bfeb-f98ea4981e2f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "904ca7a9-2b5c-4636-bfeb-f98ea4981e2f" (UID: "904ca7a9-2b5c-4636-bfeb-f98ea4981e2f"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.606812 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/904ca7a9-2b5c-4636-bfeb-f98ea4981e2f-kube-api-access-zpdkc" (OuterVolumeSpecName: "kube-api-access-zpdkc") pod "904ca7a9-2b5c-4636-bfeb-f98ea4981e2f" (UID: "904ca7a9-2b5c-4636-bfeb-f98ea4981e2f"). InnerVolumeSpecName "kube-api-access-zpdkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.655780 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0c9ffcb-625f-49f8-869a-e71e5f53b92b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e0c9ffcb-625f-49f8-869a-e71e5f53b92b" (UID: "e0c9ffcb-625f-49f8-869a-e71e5f53b92b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.670110 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0c9ffcb-625f-49f8-869a-e71e5f53b92b-config" (OuterVolumeSpecName: "config") pod "e0c9ffcb-625f-49f8-869a-e71e5f53b92b" (UID: "e0c9ffcb-625f-49f8-869a-e71e5f53b92b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.677270 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0c9ffcb-625f-49f8-869a-e71e5f53b92b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e0c9ffcb-625f-49f8-869a-e71e5f53b92b" (UID: "e0c9ffcb-625f-49f8-869a-e71e5f53b92b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.698460 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0c9ffcb-625f-49f8-869a-e71e5f53b92b-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.698518 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4f9nc\" (UniqueName: \"kubernetes.io/projected/b9dc8ed3-eab2-486b-9001-8ced7ac9ac22-kube-api-access-4f9nc\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.698549 4734 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0c9ffcb-625f-49f8-869a-e71e5f53b92b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.698560 4734 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/904ca7a9-2b5c-4636-bfeb-f98ea4981e2f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 05 
23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.698571 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/904ca7a9-2b5c-4636-bfeb-f98ea4981e2f-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.698580 4734 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0c9ffcb-625f-49f8-869a-e71e5f53b92b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.698589 4734 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/904ca7a9-2b5c-4636-bfeb-f98ea4981e2f-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.698599 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpdkc\" (UniqueName: \"kubernetes.io/projected/904ca7a9-2b5c-4636-bfeb-f98ea4981e2f-kube-api-access-zpdkc\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.698609 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c79lm\" (UniqueName: \"kubernetes.io/projected/e0c9ffcb-625f-49f8-869a-e71e5f53b92b-kube-api-access-c79lm\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.698618 4734 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9dc8ed3-eab2-486b-9001-8ced7ac9ac22-logs\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.698626 4734 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9dc8ed3-eab2-486b-9001-8ced7ac9ac22-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.698637 4734 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/904ca7a9-2b5c-4636-bfeb-f98ea4981e2f-logs\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.698646 4734 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b9dc8ed3-eab2-486b-9001-8ced7ac9ac22-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.701508 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0c9ffcb-625f-49f8-869a-e71e5f53b92b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e0c9ffcb-625f-49f8-869a-e71e5f53b92b" (UID: "e0c9ffcb-625f-49f8-869a-e71e5f53b92b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:38:33 crc kubenswrapper[4734]: I1205 23:38:33.801175 4734 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0c9ffcb-625f-49f8-869a-e71e5f53b92b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:34 crc kubenswrapper[4734]: I1205 23:38:34.339071 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-l7vfg" event={"ID":"e0c9ffcb-625f-49f8-869a-e71e5f53b92b","Type":"ContainerDied","Data":"1b5b286881ff51d2dbaf4ca3a824de9e524259ccaba1c9f7ac8244cffcb3345b"} Dec 05 23:38:34 crc kubenswrapper[4734]: I1205 23:38:34.339123 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-l7vfg" Dec 05 23:38:34 crc kubenswrapper[4734]: I1205 23:38:34.339169 4734 scope.go:117] "RemoveContainer" containerID="d3435168d1c4d378de9e64845e024fdbc2764b4dbac12d6383881b6a4ac18b30" Dec 05 23:38:34 crc kubenswrapper[4734]: I1205 23:38:34.340838 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-769b7bcdf7-h7m8l" event={"ID":"b9dc8ed3-eab2-486b-9001-8ced7ac9ac22","Type":"ContainerDied","Data":"3f81596039c8c3343db4a09330899db36e227b7a2f708e30745eccb0f18d88eb"} Dec 05 23:38:34 crc kubenswrapper[4734]: I1205 23:38:34.340860 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-769b7bcdf7-h7m8l" Dec 05 23:38:34 crc kubenswrapper[4734]: I1205 23:38:34.350199 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7ddfc99ccc-pn7dw" event={"ID":"904ca7a9-2b5c-4636-bfeb-f98ea4981e2f","Type":"ContainerDied","Data":"f76ee2e0d5b2f58fe7a2de45df3803f0bb3bc6b0d9f9a3277ad4c6c089cbef58"} Dec 05 23:38:34 crc kubenswrapper[4734]: I1205 23:38:34.350199 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7ddfc99ccc-pn7dw" Dec 05 23:38:34 crc kubenswrapper[4734]: E1205 23:38:34.352569 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-f82cb" podUID="d71f9558-c417-4cc7-934f-258f388cced2" Dec 05 23:38:34 crc kubenswrapper[4734]: I1205 23:38:34.425021 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-769b7bcdf7-h7m8l"] Dec 05 23:38:34 crc kubenswrapper[4734]: I1205 23:38:34.435992 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-769b7bcdf7-h7m8l"] Dec 05 23:38:34 crc kubenswrapper[4734]: I1205 23:38:34.448561 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-l7vfg"] Dec 05 23:38:34 crc kubenswrapper[4734]: I1205 23:38:34.480734 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-l7vfg"] Dec 05 23:38:34 crc kubenswrapper[4734]: I1205 23:38:34.480849 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7ddfc99ccc-pn7dw"] Dec 05 23:38:34 crc kubenswrapper[4734]: I1205 23:38:34.509719 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7ddfc99ccc-pn7dw"] Dec 05 23:38:34 crc kubenswrapper[4734]: E1205 23:38:34.813639 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 05 23:38:34 crc kubenswrapper[4734]: E1205 23:38:34.813829 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pf9fs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompPro
file:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-xsvx9_openstack(4c26d17f-e341-41c5-9759-c0b265fcceea): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 23:38:34 crc kubenswrapper[4734]: E1205 23:38:34.815039 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-xsvx9" podUID="4c26d17f-e341-41c5-9759-c0b265fcceea" Dec 05 23:38:35 crc kubenswrapper[4734]: E1205 23:38:35.170451 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified" Dec 05 23:38:35 crc kubenswrapper[4734]: E1205 23:38:35.171252 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-notification-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n88h5fhbch679h57h667hb6h56bh567h5bch557h64fh59ch54dh5bdh58h557h548h545h5bch585h68dh547h9h5dbh5cdhf7h7bh54h7h686h66cq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-notification-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qs8cz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/notificationhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(067f11aa-41d5-4a34-9f2e-33b35981e9ba): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 23:38:35 crc kubenswrapper[4734]: I1205 23:38:35.213758 4734 scope.go:117] "RemoveContainer" containerID="16e08f8a60eef0b7a460ab170fd2f7f8d972e23768931e83cf026c83257636ab" Dec 05 23:38:35 crc kubenswrapper[4734]: I1205 23:38:35.279076 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d469948dd-n7t4x"] Dec 05 23:38:36 crc kubenswrapper[4734]: E1205 23:38:35.448194 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-xsvx9" podUID="4c26d17f-e341-41c5-9759-c0b265fcceea" Dec 05 23:38:36 crc kubenswrapper[4734]: I1205 23:38:35.628303 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="904ca7a9-2b5c-4636-bfeb-f98ea4981e2f" path="/var/lib/kubelet/pods/904ca7a9-2b5c-4636-bfeb-f98ea4981e2f/volumes" Dec 05 23:38:36 crc 
kubenswrapper[4734]: I1205 23:38:35.631991 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9dc8ed3-eab2-486b-9001-8ced7ac9ac22" path="/var/lib/kubelet/pods/b9dc8ed3-eab2-486b-9001-8ced7ac9ac22/volumes" Dec 05 23:38:36 crc kubenswrapper[4734]: I1205 23:38:35.632598 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0c9ffcb-625f-49f8-869a-e71e5f53b92b" path="/var/lib/kubelet/pods/e0c9ffcb-625f-49f8-869a-e71e5f53b92b/volumes" Dec 05 23:38:36 crc kubenswrapper[4734]: I1205 23:38:35.717234 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-755fc898d8-dlnbz"] Dec 05 23:38:36 crc kubenswrapper[4734]: W1205 23:38:35.726486 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbcbbde9_55c9_48dc_866d_ab670775e9b3.slice/crio-a68b6ab543cfd0907876bb2575b63712fa977d5dcc196c0bdfd692f855e43e69 WatchSource:0}: Error finding container a68b6ab543cfd0907876bb2575b63712fa977d5dcc196c0bdfd692f855e43e69: Status 404 returned error can't find the container with id a68b6ab543cfd0907876bb2575b63712fa977d5dcc196c0bdfd692f855e43e69 Dec 05 23:38:36 crc kubenswrapper[4734]: I1205 23:38:36.357728 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-l7vfg" podUID="e0c9ffcb-625f-49f8-869a-e71e5f53b92b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.116:5353: i/o timeout" Dec 05 23:38:36 crc kubenswrapper[4734]: I1205 23:38:36.421214 4734 generic.go:334] "Generic (PLEG): container finished" podID="6c427c6a-2e27-4e8d-9088-1cdad55da769" containerID="adea1b041e1d7c952fad653f11acbe0d9b04cc0b8533443a7ec38aa9b962d3bf" exitCode=0 Dec 05 23:38:36 crc kubenswrapper[4734]: I1205 23:38:36.421309 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-58fkh" 
event={"ID":"6c427c6a-2e27-4e8d-9088-1cdad55da769","Type":"ContainerDied","Data":"adea1b041e1d7c952fad653f11acbe0d9b04cc0b8533443a7ec38aa9b962d3bf"} Dec 05 23:38:36 crc kubenswrapper[4734]: I1205 23:38:36.439366 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c55dfd787-8xfc2" event={"ID":"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb","Type":"ContainerStarted","Data":"1e9bd1d7fbce33ca96e48e6f2079fdd966a72795b3f096b43fc9a34d5663856c"} Dec 05 23:38:36 crc kubenswrapper[4734]: I1205 23:38:36.439424 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c55dfd787-8xfc2" event={"ID":"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb","Type":"ContainerStarted","Data":"81aae8f5faee4191e15bf0466c0ce55e2137b33ca675df174b2ad82865c93da9"} Dec 05 23:38:36 crc kubenswrapper[4734]: I1205 23:38:36.439598 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-c55dfd787-8xfc2" podUID="5f9ab2cc-aaf2-46c4-b03b-c70d220732cb" containerName="horizon-log" containerID="cri-o://81aae8f5faee4191e15bf0466c0ce55e2137b33ca675df174b2ad82865c93da9" gracePeriod=30 Dec 05 23:38:36 crc kubenswrapper[4734]: I1205 23:38:36.439747 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-c55dfd787-8xfc2" podUID="5f9ab2cc-aaf2-46c4-b03b-c70d220732cb" containerName="horizon" containerID="cri-o://1e9bd1d7fbce33ca96e48e6f2079fdd966a72795b3f096b43fc9a34d5663856c" gracePeriod=30 Dec 05 23:38:36 crc kubenswrapper[4734]: I1205 23:38:36.459058 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d469948dd-n7t4x" event={"ID":"c96cd173-4707-4edc-a92e-35db297082e2","Type":"ContainerStarted","Data":"9514774e87490121e03a56ce448d088925b7977bc46c6a8cf0102e7f1954b7d8"} Dec 05 23:38:36 crc kubenswrapper[4734]: I1205 23:38:36.459131 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d469948dd-n7t4x" 
event={"ID":"c96cd173-4707-4edc-a92e-35db297082e2","Type":"ContainerStarted","Data":"8415f4e36ba22f55e16f34c1af4d5e303eef0e3ab5c7ab16635ff6313372655d"} Dec 05 23:38:36 crc kubenswrapper[4734]: I1205 23:38:36.459146 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d469948dd-n7t4x" event={"ID":"c96cd173-4707-4edc-a92e-35db297082e2","Type":"ContainerStarted","Data":"c7b6d2ab251d04302cb99b80cf288a7b1730acbf3c36683bc32889fd89ca00f9"} Dec 05 23:38:36 crc kubenswrapper[4734]: I1205 23:38:36.496760 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-755fc898d8-dlnbz" event={"ID":"bbcbbde9-55c9-48dc-866d-ab670775e9b3","Type":"ContainerStarted","Data":"279e22d6fec0db9c031b3fd9a7e26b85946aa6adb4acda18ee5e9c2a6c0a4044"} Dec 05 23:38:36 crc kubenswrapper[4734]: I1205 23:38:36.496837 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-755fc898d8-dlnbz" event={"ID":"bbcbbde9-55c9-48dc-866d-ab670775e9b3","Type":"ContainerStarted","Data":"a68b6ab543cfd0907876bb2575b63712fa977d5dcc196c0bdfd692f855e43e69"} Dec 05 23:38:36 crc kubenswrapper[4734]: I1205 23:38:36.499435 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-c55dfd787-8xfc2" podStartSLOduration=5.352919997 podStartE2EDuration="34.499419958s" podCreationTimestamp="2025-12-05 23:38:02 +0000 UTC" firstStartedPulling="2025-12-05 23:38:04.212609771 +0000 UTC m=+1104.896014047" lastFinishedPulling="2025-12-05 23:38:33.359109732 +0000 UTC m=+1134.042514008" observedRunningTime="2025-12-05 23:38:36.474890843 +0000 UTC m=+1137.158295139" watchObservedRunningTime="2025-12-05 23:38:36.499419958 +0000 UTC m=+1137.182824224" Dec 05 23:38:36 crc kubenswrapper[4734]: I1205 23:38:36.531232 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-s4pbw"] Dec 05 23:38:36 crc kubenswrapper[4734]: I1205 23:38:36.535644 4734 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/horizon-5d469948dd-n7t4x" podStartSLOduration=25.535613045 podStartE2EDuration="25.535613045s" podCreationTimestamp="2025-12-05 23:38:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:38:36.515265201 +0000 UTC m=+1137.198669477" watchObservedRunningTime="2025-12-05 23:38:36.535613045 +0000 UTC m=+1137.219017321" Dec 05 23:38:36 crc kubenswrapper[4734]: I1205 23:38:36.555962 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-zhc67"] Dec 05 23:38:36 crc kubenswrapper[4734]: W1205 23:38:36.562811 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf84fed8_d899_47ed_a702_2fbae2f75d53.slice/crio-057ebedd054cdb0c0a0560ae9ad0cceae01b9fe37a03c6f064623499f8f9c680 WatchSource:0}: Error finding container 057ebedd054cdb0c0a0560ae9ad0cceae01b9fe37a03c6f064623499f8f9c680: Status 404 returned error can't find the container with id 057ebedd054cdb0c0a0560ae9ad0cceae01b9fe37a03c6f064623499f8f9c680 Dec 05 23:38:36 crc kubenswrapper[4734]: I1205 23:38:36.815704 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 23:38:37 crc kubenswrapper[4734]: I1205 23:38:37.533177 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"21a9837d-cc1d-4bf1-9b52-f9196880e367","Type":"ContainerStarted","Data":"768dc09f91238fb7ab263092d730b45a2a7f8de3fa4b62d2cd1007fa03d2bf9d"} Dec 05 23:38:37 crc kubenswrapper[4734]: I1205 23:38:37.545491 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-6x54r" event={"ID":"e97c0b2c-1294-43eb-a424-5c04e198611e","Type":"ContainerStarted","Data":"7f71bf11891772dbfb88764495aa12ed3fc0c9cfca8e941570c5f0658deb175b"} Dec 05 23:38:37 crc kubenswrapper[4734]: I1205 23:38:37.556927 
4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-755fc898d8-dlnbz" event={"ID":"bbcbbde9-55c9-48dc-866d-ab670775e9b3","Type":"ContainerStarted","Data":"5e27dbdf9dd92e78333b2736ebf811de16b24336624a01721d3a634e5e1831fa"} Dec 05 23:38:37 crc kubenswrapper[4734]: I1205 23:38:37.563602 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zhc67" event={"ID":"df84fed8-d899-47ed-a702-2fbae2f75d53","Type":"ContainerStarted","Data":"ecaea013bad616b1d169bf69540826a36097fec2f10acbe2671b8ebb2f430d19"} Dec 05 23:38:37 crc kubenswrapper[4734]: I1205 23:38:37.563666 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zhc67" event={"ID":"df84fed8-d899-47ed-a702-2fbae2f75d53","Type":"ContainerStarted","Data":"057ebedd054cdb0c0a0560ae9ad0cceae01b9fe37a03c6f064623499f8f9c680"} Dec 05 23:38:37 crc kubenswrapper[4734]: I1205 23:38:37.567663 4734 generic.go:334] "Generic (PLEG): container finished" podID="c0867534-4785-4d2c-9be9-74f3dfa9fb3c" containerID="e166bf2220bc5c55b26024811c8aa337944ca3c4a8166e0fbee4f218d22b925e" exitCode=0 Dec 05 23:38:37 crc kubenswrapper[4734]: I1205 23:38:37.567823 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-s4pbw" event={"ID":"c0867534-4785-4d2c-9be9-74f3dfa9fb3c","Type":"ContainerDied","Data":"e166bf2220bc5c55b26024811c8aa337944ca3c4a8166e0fbee4f218d22b925e"} Dec 05 23:38:37 crc kubenswrapper[4734]: I1205 23:38:37.567953 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-s4pbw" event={"ID":"c0867534-4785-4d2c-9be9-74f3dfa9fb3c","Type":"ContainerStarted","Data":"96520f07ec756c4b57510ed6017fbec9447c2404a0dc98819b6172f6771cc2db"} Dec 05 23:38:37 crc kubenswrapper[4734]: I1205 23:38:37.577624 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-6x54r" podStartSLOduration=2.666674694 podStartE2EDuration="35.577602871s" 
podCreationTimestamp="2025-12-05 23:38:02 +0000 UTC" firstStartedPulling="2025-12-05 23:38:04.17580696 +0000 UTC m=+1104.859211236" lastFinishedPulling="2025-12-05 23:38:37.086735137 +0000 UTC m=+1137.770139413" observedRunningTime="2025-12-05 23:38:37.571490632 +0000 UTC m=+1138.254894908" watchObservedRunningTime="2025-12-05 23:38:37.577602871 +0000 UTC m=+1138.261007147" Dec 05 23:38:37 crc kubenswrapper[4734]: I1205 23:38:37.671194 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-755fc898d8-dlnbz" podStartSLOduration=26.671171018 podStartE2EDuration="26.671171018s" podCreationTimestamp="2025-12-05 23:38:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:38:37.657297772 +0000 UTC m=+1138.340702048" watchObservedRunningTime="2025-12-05 23:38:37.671171018 +0000 UTC m=+1138.354575294" Dec 05 23:38:37 crc kubenswrapper[4734]: I1205 23:38:37.697791 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-zhc67" podStartSLOduration=18.697759952 podStartE2EDuration="18.697759952s" podCreationTimestamp="2025-12-05 23:38:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:38:37.690897606 +0000 UTC m=+1138.374301882" watchObservedRunningTime="2025-12-05 23:38:37.697759952 +0000 UTC m=+1138.381164248" Dec 05 23:38:37 crc kubenswrapper[4734]: I1205 23:38:37.720817 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 23:38:38 crc kubenswrapper[4734]: I1205 23:38:38.021742 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-58fkh" Dec 05 23:38:38 crc kubenswrapper[4734]: I1205 23:38:38.125717 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c427c6a-2e27-4e8d-9088-1cdad55da769-combined-ca-bundle\") pod \"6c427c6a-2e27-4e8d-9088-1cdad55da769\" (UID: \"6c427c6a-2e27-4e8d-9088-1cdad55da769\") " Dec 05 23:38:38 crc kubenswrapper[4734]: I1205 23:38:38.125808 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m54j6\" (UniqueName: \"kubernetes.io/projected/6c427c6a-2e27-4e8d-9088-1cdad55da769-kube-api-access-m54j6\") pod \"6c427c6a-2e27-4e8d-9088-1cdad55da769\" (UID: \"6c427c6a-2e27-4e8d-9088-1cdad55da769\") " Dec 05 23:38:38 crc kubenswrapper[4734]: I1205 23:38:38.125915 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6c427c6a-2e27-4e8d-9088-1cdad55da769-config\") pod \"6c427c6a-2e27-4e8d-9088-1cdad55da769\" (UID: \"6c427c6a-2e27-4e8d-9088-1cdad55da769\") " Dec 05 23:38:38 crc kubenswrapper[4734]: I1205 23:38:38.142890 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c427c6a-2e27-4e8d-9088-1cdad55da769-kube-api-access-m54j6" (OuterVolumeSpecName: "kube-api-access-m54j6") pod "6c427c6a-2e27-4e8d-9088-1cdad55da769" (UID: "6c427c6a-2e27-4e8d-9088-1cdad55da769"). InnerVolumeSpecName "kube-api-access-m54j6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:38:38 crc kubenswrapper[4734]: I1205 23:38:38.192715 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c427c6a-2e27-4e8d-9088-1cdad55da769-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c427c6a-2e27-4e8d-9088-1cdad55da769" (UID: "6c427c6a-2e27-4e8d-9088-1cdad55da769"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:38:38 crc kubenswrapper[4734]: I1205 23:38:38.206630 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c427c6a-2e27-4e8d-9088-1cdad55da769-config" (OuterVolumeSpecName: "config") pod "6c427c6a-2e27-4e8d-9088-1cdad55da769" (UID: "6c427c6a-2e27-4e8d-9088-1cdad55da769"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:38:38 crc kubenswrapper[4734]: I1205 23:38:38.228411 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6c427c6a-2e27-4e8d-9088-1cdad55da769-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:38 crc kubenswrapper[4734]: I1205 23:38:38.228458 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c427c6a-2e27-4e8d-9088-1cdad55da769-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:38 crc kubenswrapper[4734]: I1205 23:38:38.228470 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m54j6\" (UniqueName: \"kubernetes.io/projected/6c427c6a-2e27-4e8d-9088-1cdad55da769-kube-api-access-m54j6\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:38 crc kubenswrapper[4734]: I1205 23:38:38.644279 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"21a9837d-cc1d-4bf1-9b52-f9196880e367","Type":"ContainerStarted","Data":"e3a126dc953224f7cf69d43ba28034a6b6151fba211d245d8ef374996a56a5fa"} Dec 05 23:38:38 crc kubenswrapper[4734]: I1205 23:38:38.651401 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-58fkh" event={"ID":"6c427c6a-2e27-4e8d-9088-1cdad55da769","Type":"ContainerDied","Data":"231ea6db99fa0c7726b777738208814111587ad3583a8ae90bb8291d6897245e"} Dec 05 23:38:38 crc kubenswrapper[4734]: I1205 23:38:38.651469 4734 pod_container_deletor.go:80] "Container 
not found in pod's containers" containerID="231ea6db99fa0c7726b777738208814111587ad3583a8ae90bb8291d6897245e" Dec 05 23:38:38 crc kubenswrapper[4734]: I1205 23:38:38.651585 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-58fkh" Dec 05 23:38:38 crc kubenswrapper[4734]: I1205 23:38:38.659492 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bde2d02b-758d-49cc-ba28-9501bbc7d0b0","Type":"ContainerStarted","Data":"26fd39c3fc1619afa4e962e75d36bb1f470c64c22a3ea868e52ec6ebb8c3d440"} Dec 05 23:38:38 crc kubenswrapper[4734]: I1205 23:38:38.676367 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-s4pbw" Dec 05 23:38:38 crc kubenswrapper[4734]: I1205 23:38:38.676429 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-s4pbw" event={"ID":"c0867534-4785-4d2c-9be9-74f3dfa9fb3c","Type":"ContainerStarted","Data":"3a3da532ecc1f2bbd6753ee7f360b54117553464d2fcdd547f49e10b768038df"} Dec 05 23:38:38 crc kubenswrapper[4734]: I1205 23:38:38.706168 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-s4pbw" podStartSLOduration=14.706138493 podStartE2EDuration="14.706138493s" podCreationTimestamp="2025-12-05 23:38:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:38:38.697419292 +0000 UTC m=+1139.380823568" watchObservedRunningTime="2025-12-05 23:38:38.706138493 +0000 UTC m=+1139.389542779" Dec 05 23:38:38 crc kubenswrapper[4734]: I1205 23:38:38.861617 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-s4pbw"] Dec 05 23:38:38 crc kubenswrapper[4734]: I1205 23:38:38.910422 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-gxrch"] 
Dec 05 23:38:38 crc kubenswrapper[4734]: E1205 23:38:38.910914 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c427c6a-2e27-4e8d-9088-1cdad55da769" containerName="neutron-db-sync" Dec 05 23:38:38 crc kubenswrapper[4734]: I1205 23:38:38.910931 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c427c6a-2e27-4e8d-9088-1cdad55da769" containerName="neutron-db-sync" Dec 05 23:38:38 crc kubenswrapper[4734]: E1205 23:38:38.910939 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0c9ffcb-625f-49f8-869a-e71e5f53b92b" containerName="init" Dec 05 23:38:38 crc kubenswrapper[4734]: I1205 23:38:38.910945 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0c9ffcb-625f-49f8-869a-e71e5f53b92b" containerName="init" Dec 05 23:38:38 crc kubenswrapper[4734]: E1205 23:38:38.910970 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0c9ffcb-625f-49f8-869a-e71e5f53b92b" containerName="dnsmasq-dns" Dec 05 23:38:38 crc kubenswrapper[4734]: I1205 23:38:38.910976 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0c9ffcb-625f-49f8-869a-e71e5f53b92b" containerName="dnsmasq-dns" Dec 05 23:38:38 crc kubenswrapper[4734]: I1205 23:38:38.911161 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c427c6a-2e27-4e8d-9088-1cdad55da769" containerName="neutron-db-sync" Dec 05 23:38:38 crc kubenswrapper[4734]: I1205 23:38:38.911185 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0c9ffcb-625f-49f8-869a-e71e5f53b92b" containerName="dnsmasq-dns" Dec 05 23:38:38 crc kubenswrapper[4734]: I1205 23:38:38.912249 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-gxrch" Dec 05 23:38:38 crc kubenswrapper[4734]: I1205 23:38:38.950761 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-gxrch"] Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.032979 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-654798d8cb-lq2fv"] Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.034911 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-654798d8cb-lq2fv" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.041434 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.042012 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-pmh6z" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.042224 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.042415 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.050862 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-654798d8cb-lq2fv"] Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.106993 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhz68\" (UniqueName: \"kubernetes.io/projected/de3b13fb-9708-44af-bd09-f9be8514121e-kube-api-access-rhz68\") pod \"dnsmasq-dns-55f844cf75-gxrch\" (UID: \"de3b13fb-9708-44af-bd09-f9be8514121e\") " pod="openstack/dnsmasq-dns-55f844cf75-gxrch" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.107132 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/de3b13fb-9708-44af-bd09-f9be8514121e-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-gxrch\" (UID: \"de3b13fb-9708-44af-bd09-f9be8514121e\") " pod="openstack/dnsmasq-dns-55f844cf75-gxrch" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.107202 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de3b13fb-9708-44af-bd09-f9be8514121e-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-gxrch\" (UID: \"de3b13fb-9708-44af-bd09-f9be8514121e\") " pod="openstack/dnsmasq-dns-55f844cf75-gxrch" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.107418 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de3b13fb-9708-44af-bd09-f9be8514121e-config\") pod \"dnsmasq-dns-55f844cf75-gxrch\" (UID: \"de3b13fb-9708-44af-bd09-f9be8514121e\") " pod="openstack/dnsmasq-dns-55f844cf75-gxrch" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.107487 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de3b13fb-9708-44af-bd09-f9be8514121e-dns-svc\") pod \"dnsmasq-dns-55f844cf75-gxrch\" (UID: \"de3b13fb-9708-44af-bd09-f9be8514121e\") " pod="openstack/dnsmasq-dns-55f844cf75-gxrch" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.107686 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/de3b13fb-9708-44af-bd09-f9be8514121e-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-gxrch\" (UID: \"de3b13fb-9708-44af-bd09-f9be8514121e\") " pod="openstack/dnsmasq-dns-55f844cf75-gxrch" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.210373 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de3b13fb-9708-44af-bd09-f9be8514121e-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-gxrch\" (UID: \"de3b13fb-9708-44af-bd09-f9be8514121e\") " pod="openstack/dnsmasq-dns-55f844cf75-gxrch" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.210890 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de3b13fb-9708-44af-bd09-f9be8514121e-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-gxrch\" (UID: \"de3b13fb-9708-44af-bd09-f9be8514121e\") " pod="openstack/dnsmasq-dns-55f844cf75-gxrch" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.210972 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de3b13fb-9708-44af-bd09-f9be8514121e-config\") pod \"dnsmasq-dns-55f844cf75-gxrch\" (UID: \"de3b13fb-9708-44af-bd09-f9be8514121e\") " pod="openstack/dnsmasq-dns-55f844cf75-gxrch" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.211153 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de3b13fb-9708-44af-bd09-f9be8514121e-dns-svc\") pod \"dnsmasq-dns-55f844cf75-gxrch\" (UID: \"de3b13fb-9708-44af-bd09-f9be8514121e\") " pod="openstack/dnsmasq-dns-55f844cf75-gxrch" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.211187 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvhk8\" (UniqueName: \"kubernetes.io/projected/6a6a790b-5626-41aa-994f-0c0740790a7d-kube-api-access-kvhk8\") pod \"neutron-654798d8cb-lq2fv\" (UID: \"6a6a790b-5626-41aa-994f-0c0740790a7d\") " pod="openstack/neutron-654798d8cb-lq2fv" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.211208 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6a6a790b-5626-41aa-994f-0c0740790a7d-combined-ca-bundle\") pod \"neutron-654798d8cb-lq2fv\" (UID: \"6a6a790b-5626-41aa-994f-0c0740790a7d\") " pod="openstack/neutron-654798d8cb-lq2fv" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.211229 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6a6a790b-5626-41aa-994f-0c0740790a7d-httpd-config\") pod \"neutron-654798d8cb-lq2fv\" (UID: \"6a6a790b-5626-41aa-994f-0c0740790a7d\") " pod="openstack/neutron-654798d8cb-lq2fv" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.211262 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6a6a790b-5626-41aa-994f-0c0740790a7d-config\") pod \"neutron-654798d8cb-lq2fv\" (UID: \"6a6a790b-5626-41aa-994f-0c0740790a7d\") " pod="openstack/neutron-654798d8cb-lq2fv" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.211306 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/de3b13fb-9708-44af-bd09-f9be8514121e-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-gxrch\" (UID: \"de3b13fb-9708-44af-bd09-f9be8514121e\") " pod="openstack/dnsmasq-dns-55f844cf75-gxrch" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.211333 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhz68\" (UniqueName: \"kubernetes.io/projected/de3b13fb-9708-44af-bd09-f9be8514121e-kube-api-access-rhz68\") pod \"dnsmasq-dns-55f844cf75-gxrch\" (UID: \"de3b13fb-9708-44af-bd09-f9be8514121e\") " pod="openstack/dnsmasq-dns-55f844cf75-gxrch" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.211357 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6a6a790b-5626-41aa-994f-0c0740790a7d-ovndb-tls-certs\") pod \"neutron-654798d8cb-lq2fv\" (UID: \"6a6a790b-5626-41aa-994f-0c0740790a7d\") " pod="openstack/neutron-654798d8cb-lq2fv" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.212037 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de3b13fb-9708-44af-bd09-f9be8514121e-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-gxrch\" (UID: \"de3b13fb-9708-44af-bd09-f9be8514121e\") " pod="openstack/dnsmasq-dns-55f844cf75-gxrch" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.212727 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de3b13fb-9708-44af-bd09-f9be8514121e-config\") pod \"dnsmasq-dns-55f844cf75-gxrch\" (UID: \"de3b13fb-9708-44af-bd09-f9be8514121e\") " pod="openstack/dnsmasq-dns-55f844cf75-gxrch" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.212918 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de3b13fb-9708-44af-bd09-f9be8514121e-dns-svc\") pod \"dnsmasq-dns-55f844cf75-gxrch\" (UID: \"de3b13fb-9708-44af-bd09-f9be8514121e\") " pod="openstack/dnsmasq-dns-55f844cf75-gxrch" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.213668 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/de3b13fb-9708-44af-bd09-f9be8514121e-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-gxrch\" (UID: \"de3b13fb-9708-44af-bd09-f9be8514121e\") " pod="openstack/dnsmasq-dns-55f844cf75-gxrch" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.216303 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de3b13fb-9708-44af-bd09-f9be8514121e-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-gxrch\" (UID: 
\"de3b13fb-9708-44af-bd09-f9be8514121e\") " pod="openstack/dnsmasq-dns-55f844cf75-gxrch" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.233261 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhz68\" (UniqueName: \"kubernetes.io/projected/de3b13fb-9708-44af-bd09-f9be8514121e-kube-api-access-rhz68\") pod \"dnsmasq-dns-55f844cf75-gxrch\" (UID: \"de3b13fb-9708-44af-bd09-f9be8514121e\") " pod="openstack/dnsmasq-dns-55f844cf75-gxrch" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.240134 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-gxrch" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.312807 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a6a790b-5626-41aa-994f-0c0740790a7d-ovndb-tls-certs\") pod \"neutron-654798d8cb-lq2fv\" (UID: \"6a6a790b-5626-41aa-994f-0c0740790a7d\") " pod="openstack/neutron-654798d8cb-lq2fv" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.312942 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvhk8\" (UniqueName: \"kubernetes.io/projected/6a6a790b-5626-41aa-994f-0c0740790a7d-kube-api-access-kvhk8\") pod \"neutron-654798d8cb-lq2fv\" (UID: \"6a6a790b-5626-41aa-994f-0c0740790a7d\") " pod="openstack/neutron-654798d8cb-lq2fv" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.312965 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a6a790b-5626-41aa-994f-0c0740790a7d-combined-ca-bundle\") pod \"neutron-654798d8cb-lq2fv\" (UID: \"6a6a790b-5626-41aa-994f-0c0740790a7d\") " pod="openstack/neutron-654798d8cb-lq2fv" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.312984 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" 
(UniqueName: \"kubernetes.io/secret/6a6a790b-5626-41aa-994f-0c0740790a7d-httpd-config\") pod \"neutron-654798d8cb-lq2fv\" (UID: \"6a6a790b-5626-41aa-994f-0c0740790a7d\") " pod="openstack/neutron-654798d8cb-lq2fv" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.313015 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6a6a790b-5626-41aa-994f-0c0740790a7d-config\") pod \"neutron-654798d8cb-lq2fv\" (UID: \"6a6a790b-5626-41aa-994f-0c0740790a7d\") " pod="openstack/neutron-654798d8cb-lq2fv" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.319135 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6a6a790b-5626-41aa-994f-0c0740790a7d-httpd-config\") pod \"neutron-654798d8cb-lq2fv\" (UID: \"6a6a790b-5626-41aa-994f-0c0740790a7d\") " pod="openstack/neutron-654798d8cb-lq2fv" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.327329 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6a6a790b-5626-41aa-994f-0c0740790a7d-config\") pod \"neutron-654798d8cb-lq2fv\" (UID: \"6a6a790b-5626-41aa-994f-0c0740790a7d\") " pod="openstack/neutron-654798d8cb-lq2fv" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.327638 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a6a790b-5626-41aa-994f-0c0740790a7d-ovndb-tls-certs\") pod \"neutron-654798d8cb-lq2fv\" (UID: \"6a6a790b-5626-41aa-994f-0c0740790a7d\") " pod="openstack/neutron-654798d8cb-lq2fv" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.329408 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a6a790b-5626-41aa-994f-0c0740790a7d-combined-ca-bundle\") pod \"neutron-654798d8cb-lq2fv\" (UID: 
\"6a6a790b-5626-41aa-994f-0c0740790a7d\") " pod="openstack/neutron-654798d8cb-lq2fv" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.337891 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvhk8\" (UniqueName: \"kubernetes.io/projected/6a6a790b-5626-41aa-994f-0c0740790a7d-kube-api-access-kvhk8\") pod \"neutron-654798d8cb-lq2fv\" (UID: \"6a6a790b-5626-41aa-994f-0c0740790a7d\") " pod="openstack/neutron-654798d8cb-lq2fv" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.371423 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-654798d8cb-lq2fv" Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.705474 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bde2d02b-758d-49cc-ba28-9501bbc7d0b0","Type":"ContainerStarted","Data":"6648e28a0aabf934a1e352939dbaefd515e610df9d551a2ba25b70f48f7d745c"} Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.739819 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"21a9837d-cc1d-4bf1-9b52-f9196880e367","Type":"ContainerStarted","Data":"aec8b45d4a02f335089dc2fc5a02c45ff92549e05156e4fcc31bf2abd8834fd1"} Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.742638 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="21a9837d-cc1d-4bf1-9b52-f9196880e367" containerName="glance-log" containerID="cri-o://e3a126dc953224f7cf69d43ba28034a6b6151fba211d245d8ef374996a56a5fa" gracePeriod=30 Dec 05 23:38:39 crc kubenswrapper[4734]: I1205 23:38:39.746668 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="21a9837d-cc1d-4bf1-9b52-f9196880e367" containerName="glance-httpd" containerID="cri-o://aec8b45d4a02f335089dc2fc5a02c45ff92549e05156e4fcc31bf2abd8834fd1" 
gracePeriod=30 Dec 05 23:38:40 crc kubenswrapper[4734]: I1205 23:38:40.033833 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=16.033799621 podStartE2EDuration="16.033799621s" podCreationTimestamp="2025-12-05 23:38:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:38:39.979986537 +0000 UTC m=+1140.663390833" watchObservedRunningTime="2025-12-05 23:38:40.033799621 +0000 UTC m=+1140.717203897" Dec 05 23:38:40 crc kubenswrapper[4734]: I1205 23:38:40.536191 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-gxrch"] Dec 05 23:38:40 crc kubenswrapper[4734]: I1205 23:38:40.760762 4734 generic.go:334] "Generic (PLEG): container finished" podID="21a9837d-cc1d-4bf1-9b52-f9196880e367" containerID="e3a126dc953224f7cf69d43ba28034a6b6151fba211d245d8ef374996a56a5fa" exitCode=143 Dec 05 23:38:40 crc kubenswrapper[4734]: I1205 23:38:40.760827 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"21a9837d-cc1d-4bf1-9b52-f9196880e367","Type":"ContainerDied","Data":"e3a126dc953224f7cf69d43ba28034a6b6151fba211d245d8ef374996a56a5fa"} Dec 05 23:38:40 crc kubenswrapper[4734]: I1205 23:38:40.765559 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-gxrch" event={"ID":"de3b13fb-9708-44af-bd09-f9be8514121e","Type":"ContainerStarted","Data":"139416cd88ae51b920b2e6aada6252c49461a7153b9e5477dcbc4e64e5e09f2f"} Dec 05 23:38:40 crc kubenswrapper[4734]: I1205 23:38:40.765817 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-s4pbw" podUID="c0867534-4785-4d2c-9be9-74f3dfa9fb3c" containerName="dnsmasq-dns" containerID="cri-o://3a3da532ecc1f2bbd6753ee7f360b54117553464d2fcdd547f49e10b768038df" gracePeriod=10 
Dec 05 23:38:40 crc kubenswrapper[4734]: I1205 23:38:40.832289 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-654798d8cb-lq2fv"] Dec 05 23:38:41 crc kubenswrapper[4734]: I1205 23:38:41.799413 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-654798d8cb-lq2fv" event={"ID":"6a6a790b-5626-41aa-994f-0c0740790a7d","Type":"ContainerStarted","Data":"f8b554734d88d9c4942018b0b5319d9559c1d03041cdb1568268413735cf4883"} Dec 05 23:38:41 crc kubenswrapper[4734]: I1205 23:38:41.807878 4734 generic.go:334] "Generic (PLEG): container finished" podID="c0867534-4785-4d2c-9be9-74f3dfa9fb3c" containerID="3a3da532ecc1f2bbd6753ee7f360b54117553464d2fcdd547f49e10b768038df" exitCode=0 Dec 05 23:38:41 crc kubenswrapper[4734]: I1205 23:38:41.807945 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-s4pbw" event={"ID":"c0867534-4785-4d2c-9be9-74f3dfa9fb3c","Type":"ContainerDied","Data":"3a3da532ecc1f2bbd6753ee7f360b54117553464d2fcdd547f49e10b768038df"} Dec 05 23:38:41 crc kubenswrapper[4734]: I1205 23:38:41.867572 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-67d74d57d5-4s4p7"] Dec 05 23:38:41 crc kubenswrapper[4734]: I1205 23:38:41.874970 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-67d74d57d5-4s4p7" Dec 05 23:38:41 crc kubenswrapper[4734]: I1205 23:38:41.882712 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 05 23:38:41 crc kubenswrapper[4734]: I1205 23:38:41.882967 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 05 23:38:41 crc kubenswrapper[4734]: I1205 23:38:41.898960 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5d469948dd-n7t4x" Dec 05 23:38:41 crc kubenswrapper[4734]: I1205 23:38:41.899074 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5d469948dd-n7t4x" Dec 05 23:38:41 crc kubenswrapper[4734]: I1205 23:38:41.899115 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67d74d57d5-4s4p7"] Dec 05 23:38:41 crc kubenswrapper[4734]: I1205 23:38:41.932295 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4201381-aab2-40da-9f4a-dc31e8874266-config\") pod \"neutron-67d74d57d5-4s4p7\" (UID: \"f4201381-aab2-40da-9f4a-dc31e8874266\") " pod="openstack/neutron-67d74d57d5-4s4p7" Dec 05 23:38:41 crc kubenswrapper[4734]: I1205 23:38:41.932399 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4201381-aab2-40da-9f4a-dc31e8874266-combined-ca-bundle\") pod \"neutron-67d74d57d5-4s4p7\" (UID: \"f4201381-aab2-40da-9f4a-dc31e8874266\") " pod="openstack/neutron-67d74d57d5-4s4p7" Dec 05 23:38:41 crc kubenswrapper[4734]: I1205 23:38:41.932450 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f4201381-aab2-40da-9f4a-dc31e8874266-httpd-config\") pod 
\"neutron-67d74d57d5-4s4p7\" (UID: \"f4201381-aab2-40da-9f4a-dc31e8874266\") " pod="openstack/neutron-67d74d57d5-4s4p7" Dec 05 23:38:41 crc kubenswrapper[4734]: I1205 23:38:41.932500 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4201381-aab2-40da-9f4a-dc31e8874266-internal-tls-certs\") pod \"neutron-67d74d57d5-4s4p7\" (UID: \"f4201381-aab2-40da-9f4a-dc31e8874266\") " pod="openstack/neutron-67d74d57d5-4s4p7" Dec 05 23:38:41 crc kubenswrapper[4734]: I1205 23:38:41.932575 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4201381-aab2-40da-9f4a-dc31e8874266-ovndb-tls-certs\") pod \"neutron-67d74d57d5-4s4p7\" (UID: \"f4201381-aab2-40da-9f4a-dc31e8874266\") " pod="openstack/neutron-67d74d57d5-4s4p7" Dec 05 23:38:41 crc kubenswrapper[4734]: I1205 23:38:41.932621 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4201381-aab2-40da-9f4a-dc31e8874266-public-tls-certs\") pod \"neutron-67d74d57d5-4s4p7\" (UID: \"f4201381-aab2-40da-9f4a-dc31e8874266\") " pod="openstack/neutron-67d74d57d5-4s4p7" Dec 05 23:38:41 crc kubenswrapper[4734]: I1205 23:38:41.932643 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz8bq\" (UniqueName: \"kubernetes.io/projected/f4201381-aab2-40da-9f4a-dc31e8874266-kube-api-access-lz8bq\") pod \"neutron-67d74d57d5-4s4p7\" (UID: \"f4201381-aab2-40da-9f4a-dc31e8874266\") " pod="openstack/neutron-67d74d57d5-4s4p7" Dec 05 23:38:42 crc kubenswrapper[4734]: I1205 23:38:42.006373 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-755fc898d8-dlnbz" Dec 05 23:38:42 crc kubenswrapper[4734]: I1205 23:38:42.006853 4734 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-755fc898d8-dlnbz" Dec 05 23:38:42 crc kubenswrapper[4734]: I1205 23:38:42.035018 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f4201381-aab2-40da-9f4a-dc31e8874266-httpd-config\") pod \"neutron-67d74d57d5-4s4p7\" (UID: \"f4201381-aab2-40da-9f4a-dc31e8874266\") " pod="openstack/neutron-67d74d57d5-4s4p7" Dec 05 23:38:42 crc kubenswrapper[4734]: I1205 23:38:42.035134 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4201381-aab2-40da-9f4a-dc31e8874266-internal-tls-certs\") pod \"neutron-67d74d57d5-4s4p7\" (UID: \"f4201381-aab2-40da-9f4a-dc31e8874266\") " pod="openstack/neutron-67d74d57d5-4s4p7" Dec 05 23:38:42 crc kubenswrapper[4734]: I1205 23:38:42.035229 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4201381-aab2-40da-9f4a-dc31e8874266-ovndb-tls-certs\") pod \"neutron-67d74d57d5-4s4p7\" (UID: \"f4201381-aab2-40da-9f4a-dc31e8874266\") " pod="openstack/neutron-67d74d57d5-4s4p7" Dec 05 23:38:42 crc kubenswrapper[4734]: I1205 23:38:42.035315 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4201381-aab2-40da-9f4a-dc31e8874266-public-tls-certs\") pod \"neutron-67d74d57d5-4s4p7\" (UID: \"f4201381-aab2-40da-9f4a-dc31e8874266\") " pod="openstack/neutron-67d74d57d5-4s4p7" Dec 05 23:38:42 crc kubenswrapper[4734]: I1205 23:38:42.035360 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz8bq\" (UniqueName: \"kubernetes.io/projected/f4201381-aab2-40da-9f4a-dc31e8874266-kube-api-access-lz8bq\") pod \"neutron-67d74d57d5-4s4p7\" (UID: \"f4201381-aab2-40da-9f4a-dc31e8874266\") " 
pod="openstack/neutron-67d74d57d5-4s4p7" Dec 05 23:38:42 crc kubenswrapper[4734]: I1205 23:38:42.035462 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4201381-aab2-40da-9f4a-dc31e8874266-config\") pod \"neutron-67d74d57d5-4s4p7\" (UID: \"f4201381-aab2-40da-9f4a-dc31e8874266\") " pod="openstack/neutron-67d74d57d5-4s4p7" Dec 05 23:38:42 crc kubenswrapper[4734]: I1205 23:38:42.035495 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4201381-aab2-40da-9f4a-dc31e8874266-combined-ca-bundle\") pod \"neutron-67d74d57d5-4s4p7\" (UID: \"f4201381-aab2-40da-9f4a-dc31e8874266\") " pod="openstack/neutron-67d74d57d5-4s4p7" Dec 05 23:38:42 crc kubenswrapper[4734]: I1205 23:38:42.049439 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f4201381-aab2-40da-9f4a-dc31e8874266-httpd-config\") pod \"neutron-67d74d57d5-4s4p7\" (UID: \"f4201381-aab2-40da-9f4a-dc31e8874266\") " pod="openstack/neutron-67d74d57d5-4s4p7" Dec 05 23:38:42 crc kubenswrapper[4734]: I1205 23:38:42.049927 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4201381-aab2-40da-9f4a-dc31e8874266-internal-tls-certs\") pod \"neutron-67d74d57d5-4s4p7\" (UID: \"f4201381-aab2-40da-9f4a-dc31e8874266\") " pod="openstack/neutron-67d74d57d5-4s4p7" Dec 05 23:38:42 crc kubenswrapper[4734]: I1205 23:38:42.051172 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4201381-aab2-40da-9f4a-dc31e8874266-public-tls-certs\") pod \"neutron-67d74d57d5-4s4p7\" (UID: \"f4201381-aab2-40da-9f4a-dc31e8874266\") " pod="openstack/neutron-67d74d57d5-4s4p7" Dec 05 23:38:42 crc kubenswrapper[4734]: I1205 23:38:42.051468 4734 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4201381-aab2-40da-9f4a-dc31e8874266-config\") pod \"neutron-67d74d57d5-4s4p7\" (UID: \"f4201381-aab2-40da-9f4a-dc31e8874266\") " pod="openstack/neutron-67d74d57d5-4s4p7" Dec 05 23:38:42 crc kubenswrapper[4734]: I1205 23:38:42.056458 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4201381-aab2-40da-9f4a-dc31e8874266-ovndb-tls-certs\") pod \"neutron-67d74d57d5-4s4p7\" (UID: \"f4201381-aab2-40da-9f4a-dc31e8874266\") " pod="openstack/neutron-67d74d57d5-4s4p7" Dec 05 23:38:42 crc kubenswrapper[4734]: I1205 23:38:42.085733 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz8bq\" (UniqueName: \"kubernetes.io/projected/f4201381-aab2-40da-9f4a-dc31e8874266-kube-api-access-lz8bq\") pod \"neutron-67d74d57d5-4s4p7\" (UID: \"f4201381-aab2-40da-9f4a-dc31e8874266\") " pod="openstack/neutron-67d74d57d5-4s4p7" Dec 05 23:38:42 crc kubenswrapper[4734]: I1205 23:38:42.090143 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4201381-aab2-40da-9f4a-dc31e8874266-combined-ca-bundle\") pod \"neutron-67d74d57d5-4s4p7\" (UID: \"f4201381-aab2-40da-9f4a-dc31e8874266\") " pod="openstack/neutron-67d74d57d5-4s4p7" Dec 05 23:38:42 crc kubenswrapper[4734]: I1205 23:38:42.247058 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-67d74d57d5-4s4p7" Dec 05 23:38:42 crc kubenswrapper[4734]: I1205 23:38:42.828996 4734 generic.go:334] "Generic (PLEG): container finished" podID="de3b13fb-9708-44af-bd09-f9be8514121e" containerID="b58110ceaa8ee50a0007f940b537cb62ec0f23aa523c39b26941fb5154296dc7" exitCode=0 Dec 05 23:38:42 crc kubenswrapper[4734]: I1205 23:38:42.829496 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-gxrch" event={"ID":"de3b13fb-9708-44af-bd09-f9be8514121e","Type":"ContainerDied","Data":"b58110ceaa8ee50a0007f940b537cb62ec0f23aa523c39b26941fb5154296dc7"} Dec 05 23:38:42 crc kubenswrapper[4734]: I1205 23:38:42.841125 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bde2d02b-758d-49cc-ba28-9501bbc7d0b0","Type":"ContainerStarted","Data":"bacff3262697fa635b03db0fd39e556f123493684c24719f3e7e77ac987df7c0"} Dec 05 23:38:42 crc kubenswrapper[4734]: I1205 23:38:42.841374 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bde2d02b-758d-49cc-ba28-9501bbc7d0b0" containerName="glance-log" containerID="cri-o://6648e28a0aabf934a1e352939dbaefd515e610df9d551a2ba25b70f48f7d745c" gracePeriod=30 Dec 05 23:38:42 crc kubenswrapper[4734]: I1205 23:38:42.841747 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bde2d02b-758d-49cc-ba28-9501bbc7d0b0" containerName="glance-httpd" containerID="cri-o://bacff3262697fa635b03db0fd39e556f123493684c24719f3e7e77ac987df7c0" gracePeriod=30 Dec 05 23:38:42 crc kubenswrapper[4734]: I1205 23:38:42.849708 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-654798d8cb-lq2fv" event={"ID":"6a6a790b-5626-41aa-994f-0c0740790a7d","Type":"ContainerStarted","Data":"2d13d89d313484f007afec1ce6d2ebd5001f4bd6907f6366fc11c7dec8e9b6b1"} Dec 05 23:38:42 
crc kubenswrapper[4734]: I1205 23:38:42.852861 4734 generic.go:334] "Generic (PLEG): container finished" podID="21a9837d-cc1d-4bf1-9b52-f9196880e367" containerID="aec8b45d4a02f335089dc2fc5a02c45ff92549e05156e4fcc31bf2abd8834fd1" exitCode=0 Dec 05 23:38:42 crc kubenswrapper[4734]: I1205 23:38:42.852953 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"21a9837d-cc1d-4bf1-9b52-f9196880e367","Type":"ContainerDied","Data":"aec8b45d4a02f335089dc2fc5a02c45ff92549e05156e4fcc31bf2abd8834fd1"} Dec 05 23:38:42 crc kubenswrapper[4734]: I1205 23:38:42.901313 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=18.901274086 podStartE2EDuration="18.901274086s" podCreationTimestamp="2025-12-05 23:38:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:38:42.8919882 +0000 UTC m=+1143.575392496" watchObservedRunningTime="2025-12-05 23:38:42.901274086 +0000 UTC m=+1143.584678362" Dec 05 23:38:43 crc kubenswrapper[4734]: I1205 23:38:43.238112 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-c55dfd787-8xfc2" Dec 05 23:38:43 crc kubenswrapper[4734]: I1205 23:38:43.868426 4734 generic.go:334] "Generic (PLEG): container finished" podID="bde2d02b-758d-49cc-ba28-9501bbc7d0b0" containerID="bacff3262697fa635b03db0fd39e556f123493684c24719f3e7e77ac987df7c0" exitCode=0 Dec 05 23:38:43 crc kubenswrapper[4734]: I1205 23:38:43.868473 4734 generic.go:334] "Generic (PLEG): container finished" podID="bde2d02b-758d-49cc-ba28-9501bbc7d0b0" containerID="6648e28a0aabf934a1e352939dbaefd515e610df9d551a2ba25b70f48f7d745c" exitCode=143 Dec 05 23:38:43 crc kubenswrapper[4734]: I1205 23:38:43.868513 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"bde2d02b-758d-49cc-ba28-9501bbc7d0b0","Type":"ContainerDied","Data":"bacff3262697fa635b03db0fd39e556f123493684c24719f3e7e77ac987df7c0"} Dec 05 23:38:43 crc kubenswrapper[4734]: I1205 23:38:43.868587 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bde2d02b-758d-49cc-ba28-9501bbc7d0b0","Type":"ContainerDied","Data":"6648e28a0aabf934a1e352939dbaefd515e610df9d551a2ba25b70f48f7d745c"} Dec 05 23:38:44 crc kubenswrapper[4734]: I1205 23:38:44.881441 4734 generic.go:334] "Generic (PLEG): container finished" podID="df84fed8-d899-47ed-a702-2fbae2f75d53" containerID="ecaea013bad616b1d169bf69540826a36097fec2f10acbe2671b8ebb2f430d19" exitCode=0 Dec 05 23:38:44 crc kubenswrapper[4734]: I1205 23:38:44.881602 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zhc67" event={"ID":"df84fed8-d899-47ed-a702-2fbae2f75d53","Type":"ContainerDied","Data":"ecaea013bad616b1d169bf69540826a36097fec2f10acbe2671b8ebb2f430d19"} Dec 05 23:38:44 crc kubenswrapper[4734]: I1205 23:38:44.884167 4734 generic.go:334] "Generic (PLEG): container finished" podID="e97c0b2c-1294-43eb-a424-5c04e198611e" containerID="7f71bf11891772dbfb88764495aa12ed3fc0c9cfca8e941570c5f0658deb175b" exitCode=0 Dec 05 23:38:44 crc kubenswrapper[4734]: I1205 23:38:44.884210 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-6x54r" event={"ID":"e97c0b2c-1294-43eb-a424-5c04e198611e","Type":"ContainerDied","Data":"7f71bf11891772dbfb88764495aa12ed3fc0c9cfca8e941570c5f0658deb175b"} Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.197490 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-s4pbw" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.243176 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.351330 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-zhc67" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.374472 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-6x54r" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.390014 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e97c0b2c-1294-43eb-a424-5c04e198611e-combined-ca-bundle\") pod \"e97c0b2c-1294-43eb-a424-5c04e198611e\" (UID: \"e97c0b2c-1294-43eb-a424-5c04e198611e\") " Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.390094 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e97c0b2c-1294-43eb-a424-5c04e198611e-config-data\") pod \"e97c0b2c-1294-43eb-a424-5c04e198611e\" (UID: \"e97c0b2c-1294-43eb-a424-5c04e198611e\") " Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.390160 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqtfj\" (UniqueName: \"kubernetes.io/projected/e97c0b2c-1294-43eb-a424-5c04e198611e-kube-api-access-xqtfj\") pod \"e97c0b2c-1294-43eb-a424-5c04e198611e\" (UID: \"e97c0b2c-1294-43eb-a424-5c04e198611e\") " Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.390193 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e97c0b2c-1294-43eb-a424-5c04e198611e-scripts\") pod \"e97c0b2c-1294-43eb-a424-5c04e198611e\" (UID: \"e97c0b2c-1294-43eb-a424-5c04e198611e\") " Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.390225 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/21a9837d-cc1d-4bf1-9b52-f9196880e367-config-data\") pod \"21a9837d-cc1d-4bf1-9b52-f9196880e367\" (UID: \"21a9837d-cc1d-4bf1-9b52-f9196880e367\") " Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.390252 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56r2c\" (UniqueName: \"kubernetes.io/projected/21a9837d-cc1d-4bf1-9b52-f9196880e367-kube-api-access-56r2c\") pod \"21a9837d-cc1d-4bf1-9b52-f9196880e367\" (UID: \"21a9837d-cc1d-4bf1-9b52-f9196880e367\") " Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.390286 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"21a9837d-cc1d-4bf1-9b52-f9196880e367\" (UID: \"21a9837d-cc1d-4bf1-9b52-f9196880e367\") " Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.390320 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/21a9837d-cc1d-4bf1-9b52-f9196880e367-httpd-run\") pod \"21a9837d-cc1d-4bf1-9b52-f9196880e367\" (UID: \"21a9837d-cc1d-4bf1-9b52-f9196880e367\") " Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.390382 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e97c0b2c-1294-43eb-a424-5c04e198611e-logs\") pod \"e97c0b2c-1294-43eb-a424-5c04e198611e\" (UID: \"e97c0b2c-1294-43eb-a424-5c04e198611e\") " Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.390437 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-ovsdbserver-sb\") pod \"c0867534-4785-4d2c-9be9-74f3dfa9fb3c\" (UID: \"c0867534-4785-4d2c-9be9-74f3dfa9fb3c\") " Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.390518 4734 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df84fed8-d899-47ed-a702-2fbae2f75d53-combined-ca-bundle\") pod \"df84fed8-d899-47ed-a702-2fbae2f75d53\" (UID: \"df84fed8-d899-47ed-a702-2fbae2f75d53\") " Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.390598 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/df84fed8-d899-47ed-a702-2fbae2f75d53-credential-keys\") pod \"df84fed8-d899-47ed-a702-2fbae2f75d53\" (UID: \"df84fed8-d899-47ed-a702-2fbae2f75d53\") " Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.390629 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a9837d-cc1d-4bf1-9b52-f9196880e367-combined-ca-bundle\") pod \"21a9837d-cc1d-4bf1-9b52-f9196880e367\" (UID: \"21a9837d-cc1d-4bf1-9b52-f9196880e367\") " Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.390664 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6sht\" (UniqueName: \"kubernetes.io/projected/df84fed8-d899-47ed-a702-2fbae2f75d53-kube-api-access-m6sht\") pod \"df84fed8-d899-47ed-a702-2fbae2f75d53\" (UID: \"df84fed8-d899-47ed-a702-2fbae2f75d53\") " Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.390705 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-dns-svc\") pod \"c0867534-4785-4d2c-9be9-74f3dfa9fb3c\" (UID: \"c0867534-4785-4d2c-9be9-74f3dfa9fb3c\") " Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.390740 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21a9837d-cc1d-4bf1-9b52-f9196880e367-logs\") pod \"21a9837d-cc1d-4bf1-9b52-f9196880e367\" 
(UID: \"21a9837d-cc1d-4bf1-9b52-f9196880e367\") " Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.390808 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-ovsdbserver-nb\") pod \"c0867534-4785-4d2c-9be9-74f3dfa9fb3c\" (UID: \"c0867534-4785-4d2c-9be9-74f3dfa9fb3c\") " Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.390833 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-config\") pod \"c0867534-4785-4d2c-9be9-74f3dfa9fb3c\" (UID: \"c0867534-4785-4d2c-9be9-74f3dfa9fb3c\") " Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.390898 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df84fed8-d899-47ed-a702-2fbae2f75d53-scripts\") pod \"df84fed8-d899-47ed-a702-2fbae2f75d53\" (UID: \"df84fed8-d899-47ed-a702-2fbae2f75d53\") " Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.390941 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/df84fed8-d899-47ed-a702-2fbae2f75d53-fernet-keys\") pod \"df84fed8-d899-47ed-a702-2fbae2f75d53\" (UID: \"df84fed8-d899-47ed-a702-2fbae2f75d53\") " Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.390979 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df84fed8-d899-47ed-a702-2fbae2f75d53-config-data\") pod \"df84fed8-d899-47ed-a702-2fbae2f75d53\" (UID: \"df84fed8-d899-47ed-a702-2fbae2f75d53\") " Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.391034 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cx688\" (UniqueName: 
\"kubernetes.io/projected/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-kube-api-access-cx688\") pod \"c0867534-4785-4d2c-9be9-74f3dfa9fb3c\" (UID: \"c0867534-4785-4d2c-9be9-74f3dfa9fb3c\") " Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.391076 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-dns-swift-storage-0\") pod \"c0867534-4785-4d2c-9be9-74f3dfa9fb3c\" (UID: \"c0867534-4785-4d2c-9be9-74f3dfa9fb3c\") " Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.391101 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21a9837d-cc1d-4bf1-9b52-f9196880e367-scripts\") pod \"21a9837d-cc1d-4bf1-9b52-f9196880e367\" (UID: \"21a9837d-cc1d-4bf1-9b52-f9196880e367\") " Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.401807 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21a9837d-cc1d-4bf1-9b52-f9196880e367-scripts" (OuterVolumeSpecName: "scripts") pod "21a9837d-cc1d-4bf1-9b52-f9196880e367" (UID: "21a9837d-cc1d-4bf1-9b52-f9196880e367"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.418788 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e97c0b2c-1294-43eb-a424-5c04e198611e-logs" (OuterVolumeSpecName: "logs") pod "e97c0b2c-1294-43eb-a424-5c04e198611e" (UID: "e97c0b2c-1294-43eb-a424-5c04e198611e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.442886 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21a9837d-cc1d-4bf1-9b52-f9196880e367-logs" (OuterVolumeSpecName: "logs") pod "21a9837d-cc1d-4bf1-9b52-f9196880e367" (UID: "21a9837d-cc1d-4bf1-9b52-f9196880e367"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.443488 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21a9837d-cc1d-4bf1-9b52-f9196880e367-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "21a9837d-cc1d-4bf1-9b52-f9196880e367" (UID: "21a9837d-cc1d-4bf1-9b52-f9196880e367"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.481850 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df84fed8-d899-47ed-a702-2fbae2f75d53-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "df84fed8-d899-47ed-a702-2fbae2f75d53" (UID: "df84fed8-d899-47ed-a702-2fbae2f75d53"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.482053 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21a9837d-cc1d-4bf1-9b52-f9196880e367-kube-api-access-56r2c" (OuterVolumeSpecName: "kube-api-access-56r2c") pod "21a9837d-cc1d-4bf1-9b52-f9196880e367" (UID: "21a9837d-cc1d-4bf1-9b52-f9196880e367"). InnerVolumeSpecName "kube-api-access-56r2c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.496414 4734 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21a9837d-cc1d-4bf1-9b52-f9196880e367-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.496449 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56r2c\" (UniqueName: \"kubernetes.io/projected/21a9837d-cc1d-4bf1-9b52-f9196880e367-kube-api-access-56r2c\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.496461 4734 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/21a9837d-cc1d-4bf1-9b52-f9196880e367-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.496469 4734 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e97c0b2c-1294-43eb-a424-5c04e198611e-logs\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.496478 4734 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/df84fed8-d899-47ed-a702-2fbae2f75d53-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.496488 4734 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21a9837d-cc1d-4bf1-9b52-f9196880e367-logs\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.543482 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df84fed8-d899-47ed-a702-2fbae2f75d53-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "df84fed8-d899-47ed-a702-2fbae2f75d53" (UID: "df84fed8-d899-47ed-a702-2fbae2f75d53"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.545945 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df84fed8-d899-47ed-a702-2fbae2f75d53-kube-api-access-m6sht" (OuterVolumeSpecName: "kube-api-access-m6sht") pod "df84fed8-d899-47ed-a702-2fbae2f75d53" (UID: "df84fed8-d899-47ed-a702-2fbae2f75d53"). InnerVolumeSpecName "kube-api-access-m6sht". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.546855 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df84fed8-d899-47ed-a702-2fbae2f75d53-scripts" (OuterVolumeSpecName: "scripts") pod "df84fed8-d899-47ed-a702-2fbae2f75d53" (UID: "df84fed8-d899-47ed-a702-2fbae2f75d53"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.546952 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e97c0b2c-1294-43eb-a424-5c04e198611e-scripts" (OuterVolumeSpecName: "scripts") pod "e97c0b2c-1294-43eb-a424-5c04e198611e" (UID: "e97c0b2c-1294-43eb-a424-5c04e198611e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.547136 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e97c0b2c-1294-43eb-a424-5c04e198611e-kube-api-access-xqtfj" (OuterVolumeSpecName: "kube-api-access-xqtfj") pod "e97c0b2c-1294-43eb-a424-5c04e198611e" (UID: "e97c0b2c-1294-43eb-a424-5c04e198611e"). InnerVolumeSpecName "kube-api-access-xqtfj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.551568 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "21a9837d-cc1d-4bf1-9b52-f9196880e367" (UID: "21a9837d-cc1d-4bf1-9b52-f9196880e367"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.583902 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-kube-api-access-cx688" (OuterVolumeSpecName: "kube-api-access-cx688") pod "c0867534-4785-4d2c-9be9-74f3dfa9fb3c" (UID: "c0867534-4785-4d2c-9be9-74f3dfa9fb3c"). InnerVolumeSpecName "kube-api-access-cx688". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.598465 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6sht\" (UniqueName: \"kubernetes.io/projected/df84fed8-d899-47ed-a702-2fbae2f75d53-kube-api-access-m6sht\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.598506 4734 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df84fed8-d899-47ed-a702-2fbae2f75d53-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.598518 4734 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/df84fed8-d899-47ed-a702-2fbae2f75d53-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.598548 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cx688\" (UniqueName: \"kubernetes.io/projected/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-kube-api-access-cx688\") on node \"crc\" DevicePath 
\"\"" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.598558 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqtfj\" (UniqueName: \"kubernetes.io/projected/e97c0b2c-1294-43eb-a424-5c04e198611e-kube-api-access-xqtfj\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.598566 4734 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e97c0b2c-1294-43eb-a424-5c04e198611e-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.598593 4734 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.609171 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e97c0b2c-1294-43eb-a424-5c04e198611e-config-data" (OuterVolumeSpecName: "config-data") pod "e97c0b2c-1294-43eb-a424-5c04e198611e" (UID: "e97c0b2c-1294-43eb-a424-5c04e198611e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.703651 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e97c0b2c-1294-43eb-a424-5c04e198611e-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.797886 4734 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.805138 4734 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.877756 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21a9837d-cc1d-4bf1-9b52-f9196880e367-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21a9837d-cc1d-4bf1-9b52-f9196880e367" (UID: "21a9837d-cc1d-4bf1-9b52-f9196880e367"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.912192 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a9837d-cc1d-4bf1-9b52-f9196880e367-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.916833 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df84fed8-d899-47ed-a702-2fbae2f75d53-config-data" (OuterVolumeSpecName: "config-data") pod "df84fed8-d899-47ed-a702-2fbae2f75d53" (UID: "df84fed8-d899-47ed-a702-2fbae2f75d53"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.947904 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e97c0b2c-1294-43eb-a424-5c04e198611e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e97c0b2c-1294-43eb-a424-5c04e198611e" (UID: "e97c0b2c-1294-43eb-a424-5c04e198611e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.970205 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"21a9837d-cc1d-4bf1-9b52-f9196880e367","Type":"ContainerDied","Data":"768dc09f91238fb7ab263092d730b45a2a7f8de3fa4b62d2cd1007fa03d2bf9d"} Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.970286 4734 scope.go:117] "RemoveContainer" containerID="aec8b45d4a02f335089dc2fc5a02c45ff92549e05156e4fcc31bf2abd8834fd1" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.970445 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.990636 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-6x54r" event={"ID":"e97c0b2c-1294-43eb-a424-5c04e198611e","Type":"ContainerDied","Data":"d90e61e0737d359b9174289e510067cdf3025e9e63d6c3410087a8753bc8b23b"} Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.990746 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d90e61e0737d359b9174289e510067cdf3025e9e63d6c3410087a8753bc8b23b" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.990732 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-6x54r" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.997513 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zhc67" event={"ID":"df84fed8-d899-47ed-a702-2fbae2f75d53","Type":"ContainerDied","Data":"057ebedd054cdb0c0a0560ae9ad0cceae01b9fe37a03c6f064623499f8f9c680"} Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.997595 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="057ebedd054cdb0c0a0560ae9ad0cceae01b9fe37a03c6f064623499f8f9c680" Dec 05 23:38:47 crc kubenswrapper[4734]: I1205 23:38:47.997753 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-zhc67" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.004075 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-s4pbw" event={"ID":"c0867534-4785-4d2c-9be9-74f3dfa9fb3c","Type":"ContainerDied","Data":"96520f07ec756c4b57510ed6017fbec9447c2404a0dc98819b6172f6771cc2db"} Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.004293 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-s4pbw" Dec 05 23:38:48 crc kubenswrapper[4734]: E1205 23:38:48.062387 4734 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21a9837d_cc1d_4bf1_9b52_f9196880e367.slice/crio-e3a126dc953224f7cf69d43ba28034a6b6151fba211d245d8ef374996a56a5fa.scope\": RecentStats: unable to find data in memory cache]" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.065790 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df84fed8-d899-47ed-a702-2fbae2f75d53-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.065927 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e97c0b2c-1294-43eb-a424-5c04e198611e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.139417 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df84fed8-d899-47ed-a702-2fbae2f75d53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df84fed8-d899-47ed-a702-2fbae2f75d53" (UID: "df84fed8-d899-47ed-a702-2fbae2f75d53"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.147603 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c0867534-4785-4d2c-9be9-74f3dfa9fb3c" (UID: "c0867534-4785-4d2c-9be9-74f3dfa9fb3c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.170437 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df84fed8-d899-47ed-a702-2fbae2f75d53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.170479 4734 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.171885 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-config" (OuterVolumeSpecName: "config") pod "c0867534-4785-4d2c-9be9-74f3dfa9fb3c" (UID: "c0867534-4785-4d2c-9be9-74f3dfa9fb3c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.179117 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c0867534-4785-4d2c-9be9-74f3dfa9fb3c" (UID: "c0867534-4785-4d2c-9be9-74f3dfa9fb3c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.184611 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c0867534-4785-4d2c-9be9-74f3dfa9fb3c" (UID: "c0867534-4785-4d2c-9be9-74f3dfa9fb3c"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.187424 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c0867534-4785-4d2c-9be9-74f3dfa9fb3c" (UID: "c0867534-4785-4d2c-9be9-74f3dfa9fb3c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.227709 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21a9837d-cc1d-4bf1-9b52-f9196880e367-config-data" (OuterVolumeSpecName: "config-data") pod "21a9837d-cc1d-4bf1-9b52-f9196880e367" (UID: "21a9837d-cc1d-4bf1-9b52-f9196880e367"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.276805 4734 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.276852 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.276869 4734 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.276921 4734 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0867534-4785-4d2c-9be9-74f3dfa9fb3c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 
23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.276939 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21a9837d-cc1d-4bf1-9b52-f9196880e367-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.355648 4734 scope.go:117] "RemoveContainer" containerID="e3a126dc953224f7cf69d43ba28034a6b6151fba211d245d8ef374996a56a5fa" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.361737 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67d74d57d5-4s4p7"] Dec 05 23:38:48 crc kubenswrapper[4734]: W1205 23:38:48.384685 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4201381_aab2_40da_9f4a_dc31e8874266.slice/crio-3ee0b14d707e439db8d92653d25980317b4088a4418c8693fcfacd3dea781bf6 WatchSource:0}: Error finding container 3ee0b14d707e439db8d92653d25980317b4088a4418c8693fcfacd3dea781bf6: Status 404 returned error can't find the container with id 3ee0b14d707e439db8d92653d25980317b4088a4418c8693fcfacd3dea781bf6 Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.392590 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.428819 4734 scope.go:117] "RemoveContainer" containerID="3a3da532ecc1f2bbd6753ee7f360b54117553464d2fcdd547f49e10b768038df" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.447081 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-s4pbw"] Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.465295 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-s4pbw"] Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.482118 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-logs\") pod \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\" (UID: \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\") " Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.482204 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\" (UID: \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\") " Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.482281 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhfr4\" (UniqueName: \"kubernetes.io/projected/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-kube-api-access-bhfr4\") pod \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\" (UID: \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\") " Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.482382 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-httpd-run\") pod \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\" (UID: \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\") " Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 
23:38:48.482450 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-config-data\") pod \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\" (UID: \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\") " Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.482477 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-combined-ca-bundle\") pod \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\" (UID: \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\") " Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.482545 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-scripts\") pod \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\" (UID: \"bde2d02b-758d-49cc-ba28-9501bbc7d0b0\") " Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.491268 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-logs" (OuterVolumeSpecName: "logs") pod "bde2d02b-758d-49cc-ba28-9501bbc7d0b0" (UID: "bde2d02b-758d-49cc-ba28-9501bbc7d0b0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.491729 4734 scope.go:117] "RemoveContainer" containerID="e166bf2220bc5c55b26024811c8aa337944ca3c4a8166e0fbee4f218d22b925e" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.492234 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bde2d02b-758d-49cc-ba28-9501bbc7d0b0" (UID: "bde2d02b-758d-49cc-ba28-9501bbc7d0b0"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.493027 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-scripts" (OuterVolumeSpecName: "scripts") pod "bde2d02b-758d-49cc-ba28-9501bbc7d0b0" (UID: "bde2d02b-758d-49cc-ba28-9501bbc7d0b0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.496858 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.515955 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "bde2d02b-758d-49cc-ba28-9501bbc7d0b0" (UID: "bde2d02b-758d-49cc-ba28-9501bbc7d0b0"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.516340 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-kube-api-access-bhfr4" (OuterVolumeSpecName: "kube-api-access-bhfr4") pod "bde2d02b-758d-49cc-ba28-9501bbc7d0b0" (UID: "bde2d02b-758d-49cc-ba28-9501bbc7d0b0"). InnerVolumeSpecName "kube-api-access-bhfr4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.556076 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.570555 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 23:38:48 crc kubenswrapper[4734]: E1205 23:38:48.571108 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0867534-4785-4d2c-9be9-74f3dfa9fb3c" containerName="dnsmasq-dns" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.571504 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0867534-4785-4d2c-9be9-74f3dfa9fb3c" containerName="dnsmasq-dns" Dec 05 23:38:48 crc kubenswrapper[4734]: E1205 23:38:48.571546 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df84fed8-d899-47ed-a702-2fbae2f75d53" containerName="keystone-bootstrap" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.571555 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="df84fed8-d899-47ed-a702-2fbae2f75d53" containerName="keystone-bootstrap" Dec 05 23:38:48 crc kubenswrapper[4734]: E1205 23:38:48.571571 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde2d02b-758d-49cc-ba28-9501bbc7d0b0" containerName="glance-httpd" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.571578 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde2d02b-758d-49cc-ba28-9501bbc7d0b0" containerName="glance-httpd" Dec 05 23:38:48 crc kubenswrapper[4734]: E1205 23:38:48.571600 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21a9837d-cc1d-4bf1-9b52-f9196880e367" containerName="glance-log" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.571609 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="21a9837d-cc1d-4bf1-9b52-f9196880e367" containerName="glance-log" Dec 05 23:38:48 crc kubenswrapper[4734]: E1205 
23:38:48.571632 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21a9837d-cc1d-4bf1-9b52-f9196880e367" containerName="glance-httpd" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.571640 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="21a9837d-cc1d-4bf1-9b52-f9196880e367" containerName="glance-httpd" Dec 05 23:38:48 crc kubenswrapper[4734]: E1205 23:38:48.571650 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde2d02b-758d-49cc-ba28-9501bbc7d0b0" containerName="glance-log" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.571657 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde2d02b-758d-49cc-ba28-9501bbc7d0b0" containerName="glance-log" Dec 05 23:38:48 crc kubenswrapper[4734]: E1205 23:38:48.571672 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e97c0b2c-1294-43eb-a424-5c04e198611e" containerName="placement-db-sync" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.571678 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="e97c0b2c-1294-43eb-a424-5c04e198611e" containerName="placement-db-sync" Dec 05 23:38:48 crc kubenswrapper[4734]: E1205 23:38:48.571694 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0867534-4785-4d2c-9be9-74f3dfa9fb3c" containerName="init" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.571701 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0867534-4785-4d2c-9be9-74f3dfa9fb3c" containerName="init" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.578016 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="bde2d02b-758d-49cc-ba28-9501bbc7d0b0" containerName="glance-httpd" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.578101 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="21a9837d-cc1d-4bf1-9b52-f9196880e367" containerName="glance-log" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.578134 4734 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="c0867534-4785-4d2c-9be9-74f3dfa9fb3c" containerName="dnsmasq-dns" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.578150 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="21a9837d-cc1d-4bf1-9b52-f9196880e367" containerName="glance-httpd" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.578179 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="bde2d02b-758d-49cc-ba28-9501bbc7d0b0" containerName="glance-log" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.578209 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="e97c0b2c-1294-43eb-a424-5c04e198611e" containerName="placement-db-sync" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.578226 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="df84fed8-d899-47ed-a702-2fbae2f75d53" containerName="keystone-bootstrap" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.584640 4734 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-logs\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.584718 4734 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.584733 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhfr4\" (UniqueName: \"kubernetes.io/projected/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-kube-api-access-bhfr4\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.584744 4734 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:48 crc kubenswrapper[4734]: 
I1205 23:38:48.584754 4734 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.586906 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.586990 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bde2d02b-758d-49cc-ba28-9501bbc7d0b0" (UID: "bde2d02b-758d-49cc-ba28-9501bbc7d0b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.589748 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.598424 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.656608 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.689038 4734 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.697733 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.697777 4734 reconciler_common.go:293] "Volume detached for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.714992 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-config-data" (OuterVolumeSpecName: "config-data") pod "bde2d02b-758d-49cc-ba28-9501bbc7d0b0" (UID: "bde2d02b-758d-49cc-ba28-9501bbc7d0b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.786294 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-fffd48d8f-srcmr"] Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.787850 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-fffd48d8f-srcmr" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.789997 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.797030 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.797304 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qf7c2" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.798540 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.800729 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af968080-3e37-4034-90a7-0b654e68ee89-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:48 
crc kubenswrapper[4734]: I1205 23:38:48.800908 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af968080-3e37-4034-90a7-0b654e68ee89-config-data\") pod \"glance-default-internal-api-0\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.800955 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af968080-3e37-4034-90a7-0b654e68ee89-scripts\") pod \"glance-default-internal-api-0\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.801013 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd4fw\" (UniqueName: \"kubernetes.io/projected/af968080-3e37-4034-90a7-0b654e68ee89-kube-api-access-zd4fw\") pod \"glance-default-internal-api-0\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.801096 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/af968080-3e37-4034-90a7-0b654e68ee89-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.801186 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af968080-3e37-4034-90a7-0b654e68ee89-logs\") pod \"glance-default-internal-api-0\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") " pod="openstack/glance-default-internal-api-0" Dec 05 
23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.801325 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af968080-3e37-4034-90a7-0b654e68ee89-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.801355 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.801551 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bde2d02b-758d-49cc-ba28-9501bbc7d0b0-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.801780 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.802059 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.812598 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-85fbcb99c8-4gdvt"] Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.815409 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-85fbcb99c8-4gdvt" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.820106 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.820702 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-9bmgt" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.821007 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.821392 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.821662 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.862554 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-fffd48d8f-srcmr"] Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.883692 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-85fbcb99c8-4gdvt"] Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.904711 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/af968080-3e37-4034-90a7-0b654e68ee89-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.904802 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26447265-57c1-45c6-bbef-cf7b2a82ed85-public-tls-certs\") pod \"keystone-fffd48d8f-srcmr\" (UID: \"26447265-57c1-45c6-bbef-cf7b2a82ed85\") " 
pod="openstack/keystone-fffd48d8f-srcmr" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.904842 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af968080-3e37-4034-90a7-0b654e68ee89-logs\") pod \"glance-default-internal-api-0\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.904870 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccbbcfb6-1ffd-4c8e-8945-9d496467e46a-public-tls-certs\") pod \"placement-85fbcb99c8-4gdvt\" (UID: \"ccbbcfb6-1ffd-4c8e-8945-9d496467e46a\") " pod="openstack/placement-85fbcb99c8-4gdvt" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.904895 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af968080-3e37-4034-90a7-0b654e68ee89-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.904915 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.904969 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccbbcfb6-1ffd-4c8e-8945-9d496467e46a-combined-ca-bundle\") pod \"placement-85fbcb99c8-4gdvt\" (UID: \"ccbbcfb6-1ffd-4c8e-8945-9d496467e46a\") " pod="openstack/placement-85fbcb99c8-4gdvt" Dec 05 
23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.904996 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccbbcfb6-1ffd-4c8e-8945-9d496467e46a-config-data\") pod \"placement-85fbcb99c8-4gdvt\" (UID: \"ccbbcfb6-1ffd-4c8e-8945-9d496467e46a\") " pod="openstack/placement-85fbcb99c8-4gdvt" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.905015 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/26447265-57c1-45c6-bbef-cf7b2a82ed85-credential-keys\") pod \"keystone-fffd48d8f-srcmr\" (UID: \"26447265-57c1-45c6-bbef-cf7b2a82ed85\") " pod="openstack/keystone-fffd48d8f-srcmr" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.905042 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26447265-57c1-45c6-bbef-cf7b2a82ed85-config-data\") pod \"keystone-fffd48d8f-srcmr\" (UID: \"26447265-57c1-45c6-bbef-cf7b2a82ed85\") " pod="openstack/keystone-fffd48d8f-srcmr" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.905078 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af968080-3e37-4034-90a7-0b654e68ee89-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.905100 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26447265-57c1-45c6-bbef-cf7b2a82ed85-combined-ca-bundle\") pod \"keystone-fffd48d8f-srcmr\" (UID: \"26447265-57c1-45c6-bbef-cf7b2a82ed85\") " pod="openstack/keystone-fffd48d8f-srcmr" Dec 05 23:38:48 crc 
kubenswrapper[4734]: I1205 23:38:48.905121 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccbbcfb6-1ffd-4c8e-8945-9d496467e46a-logs\") pod \"placement-85fbcb99c8-4gdvt\" (UID: \"ccbbcfb6-1ffd-4c8e-8945-9d496467e46a\") " pod="openstack/placement-85fbcb99c8-4gdvt" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.905151 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26447265-57c1-45c6-bbef-cf7b2a82ed85-scripts\") pod \"keystone-fffd48d8f-srcmr\" (UID: \"26447265-57c1-45c6-bbef-cf7b2a82ed85\") " pod="openstack/keystone-fffd48d8f-srcmr" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.905175 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/26447265-57c1-45c6-bbef-cf7b2a82ed85-fernet-keys\") pod \"keystone-fffd48d8f-srcmr\" (UID: \"26447265-57c1-45c6-bbef-cf7b2a82ed85\") " pod="openstack/keystone-fffd48d8f-srcmr" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.905403 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/af968080-3e37-4034-90a7-0b654e68ee89-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.906274 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcmj2\" (UniqueName: \"kubernetes.io/projected/26447265-57c1-45c6-bbef-cf7b2a82ed85-kube-api-access-xcmj2\") pod \"keystone-fffd48d8f-srcmr\" (UID: \"26447265-57c1-45c6-bbef-cf7b2a82ed85\") " pod="openstack/keystone-fffd48d8f-srcmr" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.906376 4734 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvf8z\" (UniqueName: \"kubernetes.io/projected/ccbbcfb6-1ffd-4c8e-8945-9d496467e46a-kube-api-access-qvf8z\") pod \"placement-85fbcb99c8-4gdvt\" (UID: \"ccbbcfb6-1ffd-4c8e-8945-9d496467e46a\") " pod="openstack/placement-85fbcb99c8-4gdvt" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.906492 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af968080-3e37-4034-90a7-0b654e68ee89-config-data\") pod \"glance-default-internal-api-0\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.906545 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af968080-3e37-4034-90a7-0b654e68ee89-scripts\") pod \"glance-default-internal-api-0\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.906596 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd4fw\" (UniqueName: \"kubernetes.io/projected/af968080-3e37-4034-90a7-0b654e68ee89-kube-api-access-zd4fw\") pod \"glance-default-internal-api-0\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.906622 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccbbcfb6-1ffd-4c8e-8945-9d496467e46a-scripts\") pod \"placement-85fbcb99c8-4gdvt\" (UID: \"ccbbcfb6-1ffd-4c8e-8945-9d496467e46a\") " pod="openstack/placement-85fbcb99c8-4gdvt" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.906650 4734 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccbbcfb6-1ffd-4c8e-8945-9d496467e46a-internal-tls-certs\") pod \"placement-85fbcb99c8-4gdvt\" (UID: \"ccbbcfb6-1ffd-4c8e-8945-9d496467e46a\") " pod="openstack/placement-85fbcb99c8-4gdvt" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.906674 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26447265-57c1-45c6-bbef-cf7b2a82ed85-internal-tls-certs\") pod \"keystone-fffd48d8f-srcmr\" (UID: \"26447265-57c1-45c6-bbef-cf7b2a82ed85\") " pod="openstack/keystone-fffd48d8f-srcmr" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.907108 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af968080-3e37-4034-90a7-0b654e68ee89-logs\") pod \"glance-default-internal-api-0\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.908303 4734 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.915797 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af968080-3e37-4034-90a7-0b654e68ee89-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.915862 4734 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af968080-3e37-4034-90a7-0b654e68ee89-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.920619 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af968080-3e37-4034-90a7-0b654e68ee89-config-data\") pod \"glance-default-internal-api-0\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.928241 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af968080-3e37-4034-90a7-0b654e68ee89-scripts\") pod \"glance-default-internal-api-0\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.932430 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd4fw\" (UniqueName: \"kubernetes.io/projected/af968080-3e37-4034-90a7-0b654e68ee89-kube-api-access-zd4fw\") pod \"glance-default-internal-api-0\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:48 crc kubenswrapper[4734]: I1205 23:38:48.951917 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.008955 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ccbbcfb6-1ffd-4c8e-8945-9d496467e46a-combined-ca-bundle\") pod \"placement-85fbcb99c8-4gdvt\" (UID: \"ccbbcfb6-1ffd-4c8e-8945-9d496467e46a\") " pod="openstack/placement-85fbcb99c8-4gdvt" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.009046 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccbbcfb6-1ffd-4c8e-8945-9d496467e46a-config-data\") pod \"placement-85fbcb99c8-4gdvt\" (UID: \"ccbbcfb6-1ffd-4c8e-8945-9d496467e46a\") " pod="openstack/placement-85fbcb99c8-4gdvt" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.009069 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/26447265-57c1-45c6-bbef-cf7b2a82ed85-credential-keys\") pod \"keystone-fffd48d8f-srcmr\" (UID: \"26447265-57c1-45c6-bbef-cf7b2a82ed85\") " pod="openstack/keystone-fffd48d8f-srcmr" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.009097 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26447265-57c1-45c6-bbef-cf7b2a82ed85-config-data\") pod \"keystone-fffd48d8f-srcmr\" (UID: \"26447265-57c1-45c6-bbef-cf7b2a82ed85\") " pod="openstack/keystone-fffd48d8f-srcmr" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.009122 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26447265-57c1-45c6-bbef-cf7b2a82ed85-combined-ca-bundle\") pod \"keystone-fffd48d8f-srcmr\" (UID: \"26447265-57c1-45c6-bbef-cf7b2a82ed85\") " pod="openstack/keystone-fffd48d8f-srcmr" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.009141 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccbbcfb6-1ffd-4c8e-8945-9d496467e46a-logs\") pod \"placement-85fbcb99c8-4gdvt\" 
(UID: \"ccbbcfb6-1ffd-4c8e-8945-9d496467e46a\") " pod="openstack/placement-85fbcb99c8-4gdvt" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.009175 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26447265-57c1-45c6-bbef-cf7b2a82ed85-scripts\") pod \"keystone-fffd48d8f-srcmr\" (UID: \"26447265-57c1-45c6-bbef-cf7b2a82ed85\") " pod="openstack/keystone-fffd48d8f-srcmr" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.009200 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/26447265-57c1-45c6-bbef-cf7b2a82ed85-fernet-keys\") pod \"keystone-fffd48d8f-srcmr\" (UID: \"26447265-57c1-45c6-bbef-cf7b2a82ed85\") " pod="openstack/keystone-fffd48d8f-srcmr" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.009236 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcmj2\" (UniqueName: \"kubernetes.io/projected/26447265-57c1-45c6-bbef-cf7b2a82ed85-kube-api-access-xcmj2\") pod \"keystone-fffd48d8f-srcmr\" (UID: \"26447265-57c1-45c6-bbef-cf7b2a82ed85\") " pod="openstack/keystone-fffd48d8f-srcmr" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.009284 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvf8z\" (UniqueName: \"kubernetes.io/projected/ccbbcfb6-1ffd-4c8e-8945-9d496467e46a-kube-api-access-qvf8z\") pod \"placement-85fbcb99c8-4gdvt\" (UID: \"ccbbcfb6-1ffd-4c8e-8945-9d496467e46a\") " pod="openstack/placement-85fbcb99c8-4gdvt" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.009327 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccbbcfb6-1ffd-4c8e-8945-9d496467e46a-scripts\") pod \"placement-85fbcb99c8-4gdvt\" (UID: \"ccbbcfb6-1ffd-4c8e-8945-9d496467e46a\") " pod="openstack/placement-85fbcb99c8-4gdvt" Dec 05 
23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.009352 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccbbcfb6-1ffd-4c8e-8945-9d496467e46a-internal-tls-certs\") pod \"placement-85fbcb99c8-4gdvt\" (UID: \"ccbbcfb6-1ffd-4c8e-8945-9d496467e46a\") " pod="openstack/placement-85fbcb99c8-4gdvt" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.009371 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26447265-57c1-45c6-bbef-cf7b2a82ed85-internal-tls-certs\") pod \"keystone-fffd48d8f-srcmr\" (UID: \"26447265-57c1-45c6-bbef-cf7b2a82ed85\") " pod="openstack/keystone-fffd48d8f-srcmr" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.009410 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26447265-57c1-45c6-bbef-cf7b2a82ed85-public-tls-certs\") pod \"keystone-fffd48d8f-srcmr\" (UID: \"26447265-57c1-45c6-bbef-cf7b2a82ed85\") " pod="openstack/keystone-fffd48d8f-srcmr" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.009438 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccbbcfb6-1ffd-4c8e-8945-9d496467e46a-public-tls-certs\") pod \"placement-85fbcb99c8-4gdvt\" (UID: \"ccbbcfb6-1ffd-4c8e-8945-9d496467e46a\") " pod="openstack/placement-85fbcb99c8-4gdvt" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.010472 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccbbcfb6-1ffd-4c8e-8945-9d496467e46a-logs\") pod \"placement-85fbcb99c8-4gdvt\" (UID: \"ccbbcfb6-1ffd-4c8e-8945-9d496467e46a\") " pod="openstack/placement-85fbcb99c8-4gdvt" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.023485 4734 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26447265-57c1-45c6-bbef-cf7b2a82ed85-combined-ca-bundle\") pod \"keystone-fffd48d8f-srcmr\" (UID: \"26447265-57c1-45c6-bbef-cf7b2a82ed85\") " pod="openstack/keystone-fffd48d8f-srcmr" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.024571 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/26447265-57c1-45c6-bbef-cf7b2a82ed85-fernet-keys\") pod \"keystone-fffd48d8f-srcmr\" (UID: \"26447265-57c1-45c6-bbef-cf7b2a82ed85\") " pod="openstack/keystone-fffd48d8f-srcmr" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.025423 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccbbcfb6-1ffd-4c8e-8945-9d496467e46a-internal-tls-certs\") pod \"placement-85fbcb99c8-4gdvt\" (UID: \"ccbbcfb6-1ffd-4c8e-8945-9d496467e46a\") " pod="openstack/placement-85fbcb99c8-4gdvt" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.025687 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26447265-57c1-45c6-bbef-cf7b2a82ed85-public-tls-certs\") pod \"keystone-fffd48d8f-srcmr\" (UID: \"26447265-57c1-45c6-bbef-cf7b2a82ed85\") " pod="openstack/keystone-fffd48d8f-srcmr" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.030145 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccbbcfb6-1ffd-4c8e-8945-9d496467e46a-config-data\") pod \"placement-85fbcb99c8-4gdvt\" (UID: \"ccbbcfb6-1ffd-4c8e-8945-9d496467e46a\") " pod="openstack/placement-85fbcb99c8-4gdvt" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.031302 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/26447265-57c1-45c6-bbef-cf7b2a82ed85-internal-tls-certs\") pod \"keystone-fffd48d8f-srcmr\" (UID: \"26447265-57c1-45c6-bbef-cf7b2a82ed85\") " pod="openstack/keystone-fffd48d8f-srcmr" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.033787 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccbbcfb6-1ffd-4c8e-8945-9d496467e46a-public-tls-certs\") pod \"placement-85fbcb99c8-4gdvt\" (UID: \"ccbbcfb6-1ffd-4c8e-8945-9d496467e46a\") " pod="openstack/placement-85fbcb99c8-4gdvt" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.034702 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-654798d8cb-lq2fv" event={"ID":"6a6a790b-5626-41aa-994f-0c0740790a7d","Type":"ContainerStarted","Data":"efb38390a599c517100ebac471fbf6ae2aa043331ad9aa34737375b0e2c3b959"} Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.034764 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-654798d8cb-lq2fv" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.036586 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26447265-57c1-45c6-bbef-cf7b2a82ed85-config-data\") pod \"keystone-fffd48d8f-srcmr\" (UID: \"26447265-57c1-45c6-bbef-cf7b2a82ed85\") " pod="openstack/keystone-fffd48d8f-srcmr" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.036893 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26447265-57c1-45c6-bbef-cf7b2a82ed85-scripts\") pod \"keystone-fffd48d8f-srcmr\" (UID: \"26447265-57c1-45c6-bbef-cf7b2a82ed85\") " pod="openstack/keystone-fffd48d8f-srcmr" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.037403 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/26447265-57c1-45c6-bbef-cf7b2a82ed85-credential-keys\") pod \"keystone-fffd48d8f-srcmr\" (UID: \"26447265-57c1-45c6-bbef-cf7b2a82ed85\") " pod="openstack/keystone-fffd48d8f-srcmr" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.039733 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccbbcfb6-1ffd-4c8e-8945-9d496467e46a-scripts\") pod \"placement-85fbcb99c8-4gdvt\" (UID: \"ccbbcfb6-1ffd-4c8e-8945-9d496467e46a\") " pod="openstack/placement-85fbcb99c8-4gdvt" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.074358 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcmj2\" (UniqueName: \"kubernetes.io/projected/26447265-57c1-45c6-bbef-cf7b2a82ed85-kube-api-access-xcmj2\") pod \"keystone-fffd48d8f-srcmr\" (UID: \"26447265-57c1-45c6-bbef-cf7b2a82ed85\") " pod="openstack/keystone-fffd48d8f-srcmr" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.082277 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvf8z\" (UniqueName: \"kubernetes.io/projected/ccbbcfb6-1ffd-4c8e-8945-9d496467e46a-kube-api-access-qvf8z\") pod \"placement-85fbcb99c8-4gdvt\" (UID: \"ccbbcfb6-1ffd-4c8e-8945-9d496467e46a\") " pod="openstack/placement-85fbcb99c8-4gdvt" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.082766 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67d74d57d5-4s4p7" event={"ID":"f4201381-aab2-40da-9f4a-dc31e8874266","Type":"ContainerStarted","Data":"3ee0b14d707e439db8d92653d25980317b4088a4418c8693fcfacd3dea781bf6"} Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.118874 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-gxrch" event={"ID":"de3b13fb-9708-44af-bd09-f9be8514121e","Type":"ContainerStarted","Data":"bd86aa3a14b4c4ea3568fc35224329174b7da35ecc2b98b714bb50e2a2945b9f"} Dec 05 23:38:49 crc 
kubenswrapper[4734]: I1205 23:38:49.120500 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-gxrch" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.134809 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccbbcfb6-1ffd-4c8e-8945-9d496467e46a-combined-ca-bundle\") pod \"placement-85fbcb99c8-4gdvt\" (UID: \"ccbbcfb6-1ffd-4c8e-8945-9d496467e46a\") " pod="openstack/placement-85fbcb99c8-4gdvt" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.150173 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-fffd48d8f-srcmr" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.160361 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bde2d02b-758d-49cc-ba28-9501bbc7d0b0","Type":"ContainerDied","Data":"26fd39c3fc1619afa4e962e75d36bb1f470c64c22a3ea868e52ec6ebb8c3d440"} Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.160450 4734 scope.go:117] "RemoveContainer" containerID="bacff3262697fa635b03db0fd39e556f123493684c24719f3e7e77ac987df7c0" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.160694 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.213647 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-85fbcb99c8-4gdvt" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.238216 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.238394 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"067f11aa-41d5-4a34-9f2e-33b35981e9ba","Type":"ContainerStarted","Data":"cd25d5410428568ea4619f8ae105032aee9215147914ec8375ef8d26a79e3039"} Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.352922 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-654798d8cb-lq2fv" podStartSLOduration=11.352882289 podStartE2EDuration="11.352882289s" podCreationTimestamp="2025-12-05 23:38:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:38:49.098554217 +0000 UTC m=+1149.781958493" watchObservedRunningTime="2025-12-05 23:38:49.352882289 +0000 UTC m=+1150.036286565" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.356963 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-gxrch" podStartSLOduration=11.356933167 podStartE2EDuration="11.356933167s" podCreationTimestamp="2025-12-05 23:38:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:38:49.234339427 +0000 UTC m=+1149.917743703" watchObservedRunningTime="2025-12-05 23:38:49.356933167 +0000 UTC m=+1150.040337443" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.368563 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.384827 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.416707 4734 scope.go:117] "RemoveContainer" 
containerID="6648e28a0aabf934a1e352939dbaefd515e610df9d551a2ba25b70f48f7d745c" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.444448 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.446387 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.458710 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.459767 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.530784 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-785d8bcb8c-s4pbw" podUID="c0867534-4785-4d2c-9be9-74f3dfa9fb3c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.147:5353: i/o timeout" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.544508 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.585307 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq2zb\" (UniqueName: \"kubernetes.io/projected/05a4de5c-b10c-4d66-b3dd-0468357229b0-kube-api-access-bq2zb\") pod \"glance-default-external-api-0\" (UID: \"05a4de5c-b10c-4d66-b3dd-0468357229b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.585904 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05a4de5c-b10c-4d66-b3dd-0468357229b0-logs\") pod \"glance-default-external-api-0\" (UID: 
\"05a4de5c-b10c-4d66-b3dd-0468357229b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.586012 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a4de5c-b10c-4d66-b3dd-0468357229b0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"05a4de5c-b10c-4d66-b3dd-0468357229b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.586116 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05a4de5c-b10c-4d66-b3dd-0468357229b0-scripts\") pod \"glance-default-external-api-0\" (UID: \"05a4de5c-b10c-4d66-b3dd-0468357229b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.586221 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05a4de5c-b10c-4d66-b3dd-0468357229b0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"05a4de5c-b10c-4d66-b3dd-0468357229b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.586306 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05a4de5c-b10c-4d66-b3dd-0468357229b0-config-data\") pod \"glance-default-external-api-0\" (UID: \"05a4de5c-b10c-4d66-b3dd-0468357229b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.586460 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05a4de5c-b10c-4d66-b3dd-0468357229b0-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"05a4de5c-b10c-4d66-b3dd-0468357229b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.586626 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"05a4de5c-b10c-4d66-b3dd-0468357229b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.683289 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21a9837d-cc1d-4bf1-9b52-f9196880e367" path="/var/lib/kubelet/pods/21a9837d-cc1d-4bf1-9b52-f9196880e367/volumes" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.688768 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05a4de5c-b10c-4d66-b3dd-0468357229b0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"05a4de5c-b10c-4d66-b3dd-0468357229b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.688816 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05a4de5c-b10c-4d66-b3dd-0468357229b0-config-data\") pod \"glance-default-external-api-0\" (UID: \"05a4de5c-b10c-4d66-b3dd-0468357229b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.690416 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05a4de5c-b10c-4d66-b3dd-0468357229b0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"05a4de5c-b10c-4d66-b3dd-0468357229b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.692244 4734 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05a4de5c-b10c-4d66-b3dd-0468357229b0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"05a4de5c-b10c-4d66-b3dd-0468357229b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.692351 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"05a4de5c-b10c-4d66-b3dd-0468357229b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.692478 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq2zb\" (UniqueName: \"kubernetes.io/projected/05a4de5c-b10c-4d66-b3dd-0468357229b0-kube-api-access-bq2zb\") pod \"glance-default-external-api-0\" (UID: \"05a4de5c-b10c-4d66-b3dd-0468357229b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.692509 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05a4de5c-b10c-4d66-b3dd-0468357229b0-logs\") pod \"glance-default-external-api-0\" (UID: \"05a4de5c-b10c-4d66-b3dd-0468357229b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.692609 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a4de5c-b10c-4d66-b3dd-0468357229b0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"05a4de5c-b10c-4d66-b3dd-0468357229b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.692657 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/05a4de5c-b10c-4d66-b3dd-0468357229b0-scripts\") pod \"glance-default-external-api-0\" (UID: \"05a4de5c-b10c-4d66-b3dd-0468357229b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.698179 4734 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"05a4de5c-b10c-4d66-b3dd-0468357229b0\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.701082 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05a4de5c-b10c-4d66-b3dd-0468357229b0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"05a4de5c-b10c-4d66-b3dd-0468357229b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.705724 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05a4de5c-b10c-4d66-b3dd-0468357229b0-logs\") pod \"glance-default-external-api-0\" (UID: \"05a4de5c-b10c-4d66-b3dd-0468357229b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.707998 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05a4de5c-b10c-4d66-b3dd-0468357229b0-config-data\") pod \"glance-default-external-api-0\" (UID: \"05a4de5c-b10c-4d66-b3dd-0468357229b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.716825 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bde2d02b-758d-49cc-ba28-9501bbc7d0b0" path="/var/lib/kubelet/pods/bde2d02b-758d-49cc-ba28-9501bbc7d0b0/volumes" Dec 
05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.722314 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0867534-4785-4d2c-9be9-74f3dfa9fb3c" path="/var/lib/kubelet/pods/c0867534-4785-4d2c-9be9-74f3dfa9fb3c/volumes" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.731489 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a4de5c-b10c-4d66-b3dd-0468357229b0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"05a4de5c-b10c-4d66-b3dd-0468357229b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.733897 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05a4de5c-b10c-4d66-b3dd-0468357229b0-scripts\") pod \"glance-default-external-api-0\" (UID: \"05a4de5c-b10c-4d66-b3dd-0468357229b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.755352 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq2zb\" (UniqueName: \"kubernetes.io/projected/05a4de5c-b10c-4d66-b3dd-0468357229b0-kube-api-access-bq2zb\") pod \"glance-default-external-api-0\" (UID: \"05a4de5c-b10c-4d66-b3dd-0468357229b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:49 crc kubenswrapper[4734]: I1205 23:38:49.851451 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"05a4de5c-b10c-4d66-b3dd-0468357229b0\") " pod="openstack/glance-default-external-api-0" Dec 05 23:38:50 crc kubenswrapper[4734]: I1205 23:38:50.100112 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 23:38:50 crc kubenswrapper[4734]: I1205 23:38:50.242547 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-85fbcb99c8-4gdvt"] Dec 05 23:38:50 crc kubenswrapper[4734]: W1205 23:38:50.282479 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podccbbcfb6_1ffd_4c8e_8945_9d496467e46a.slice/crio-795815f28efcd2ce7e8d1decc255dbc4b90c2916765d28b452f6b7d1857859f9 WatchSource:0}: Error finding container 795815f28efcd2ce7e8d1decc255dbc4b90c2916765d28b452f6b7d1857859f9: Status 404 returned error can't find the container with id 795815f28efcd2ce7e8d1decc255dbc4b90c2916765d28b452f6b7d1857859f9 Dec 05 23:38:50 crc kubenswrapper[4734]: I1205 23:38:50.294877 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67d74d57d5-4s4p7" event={"ID":"f4201381-aab2-40da-9f4a-dc31e8874266","Type":"ContainerStarted","Data":"51a91097aace527b936894fd532601ead6ceca503b2c3a01d165f15c172dd3c5"} Dec 05 23:38:50 crc kubenswrapper[4734]: I1205 23:38:50.435740 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 23:38:50 crc kubenswrapper[4734]: I1205 23:38:50.449435 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:38:50 crc kubenswrapper[4734]: I1205 23:38:50.449601 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
Dec 05 23:38:50 crc kubenswrapper[4734]: I1205 23:38:50.473180 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-fffd48d8f-srcmr"] Dec 05 23:38:50 crc kubenswrapper[4734]: W1205 23:38:50.518769 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26447265_57c1_45c6_bbef_cf7b2a82ed85.slice/crio-e201ed6654ea0d653c2d42e5f3aa2b4ac1a8c4205c5c5011562222d5e273a055 WatchSource:0}: Error finding container e201ed6654ea0d653c2d42e5f3aa2b4ac1a8c4205c5c5011562222d5e273a055: Status 404 returned error can't find the container with id e201ed6654ea0d653c2d42e5f3aa2b4ac1a8c4205c5c5011562222d5e273a055 Dec 05 23:38:51 crc kubenswrapper[4734]: I1205 23:38:51.306952 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 23:38:51 crc kubenswrapper[4734]: I1205 23:38:51.389117 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-85fbcb99c8-4gdvt" event={"ID":"ccbbcfb6-1ffd-4c8e-8945-9d496467e46a","Type":"ContainerStarted","Data":"795815f28efcd2ce7e8d1decc255dbc4b90c2916765d28b452f6b7d1857859f9"} Dec 05 23:38:51 crc kubenswrapper[4734]: I1205 23:38:51.394717 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-f82cb" event={"ID":"d71f9558-c417-4cc7-934f-258f388cced2","Type":"ContainerStarted","Data":"ad3d670f413f88dcf8202b5fcb6c9d218e25ec1a2ba6fa4ebad056845b56179f"} Dec 05 23:38:51 crc kubenswrapper[4734]: I1205 23:38:51.397835 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-fffd48d8f-srcmr" event={"ID":"26447265-57c1-45c6-bbef-cf7b2a82ed85","Type":"ContainerStarted","Data":"e201ed6654ea0d653c2d42e5f3aa2b4ac1a8c4205c5c5011562222d5e273a055"} Dec 05 23:38:51 crc kubenswrapper[4734]: I1205 23:38:51.402668 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"af968080-3e37-4034-90a7-0b654e68ee89","Type":"ContainerStarted","Data":"2e3140b54dbe92c6d8dc229652ca7c5076899456ce76f964515283f40ffe415a"} Dec 05 23:38:51 crc kubenswrapper[4734]: I1205 23:38:51.428170 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67d74d57d5-4s4p7" event={"ID":"f4201381-aab2-40da-9f4a-dc31e8874266","Type":"ContainerStarted","Data":"8c3f03c821df7075f93f31c1331a04c53182696fdbf90f581d9fd5394f4ceb9d"} Dec 05 23:38:51 crc kubenswrapper[4734]: I1205 23:38:51.429727 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-67d74d57d5-4s4p7" Dec 05 23:38:51 crc kubenswrapper[4734]: I1205 23:38:51.454632 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"05a4de5c-b10c-4d66-b3dd-0468357229b0","Type":"ContainerStarted","Data":"3a448d7b8af7253e02b0d41664aa248ec5d29f3d1038d90d6a30c0cdbe378b80"} Dec 05 23:38:51 crc kubenswrapper[4734]: I1205 23:38:51.470655 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-f82cb" podStartSLOduration=4.205754032 podStartE2EDuration="49.470629589s" podCreationTimestamp="2025-12-05 23:38:02 +0000 UTC" firstStartedPulling="2025-12-05 23:38:04.210788367 +0000 UTC m=+1104.894192643" lastFinishedPulling="2025-12-05 23:38:49.475663924 +0000 UTC m=+1150.159068200" observedRunningTime="2025-12-05 23:38:51.427762531 +0000 UTC m=+1152.111166817" watchObservedRunningTime="2025-12-05 23:38:51.470629589 +0000 UTC m=+1152.154033865" Dec 05 23:38:51 crc kubenswrapper[4734]: I1205 23:38:51.486893 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-67d74d57d5-4s4p7" podStartSLOduration=10.486867522 podStartE2EDuration="10.486867522s" podCreationTimestamp="2025-12-05 23:38:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-05 23:38:51.468118188 +0000 UTC m=+1152.151522464" watchObservedRunningTime="2025-12-05 23:38:51.486867522 +0000 UTC m=+1152.170271798" Dec 05 23:38:51 crc kubenswrapper[4734]: I1205 23:38:51.902857 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5d469948dd-n7t4x" podUID="c96cd173-4707-4edc-a92e-35db297082e2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Dec 05 23:38:52 crc kubenswrapper[4734]: I1205 23:38:52.010585 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-755fc898d8-dlnbz" podUID="bbcbbde9-55c9-48dc-866d-ab670775e9b3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused" Dec 05 23:38:52 crc kubenswrapper[4734]: I1205 23:38:52.482688 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-85fbcb99c8-4gdvt" event={"ID":"ccbbcfb6-1ffd-4c8e-8945-9d496467e46a","Type":"ContainerStarted","Data":"8dc585f182af5dce73039ed3e060a3dd42df50cd5b40e6cd96ff7f83d7e9afe0"} Dec 05 23:38:52 crc kubenswrapper[4734]: I1205 23:38:52.495245 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-fffd48d8f-srcmr" event={"ID":"26447265-57c1-45c6-bbef-cf7b2a82ed85","Type":"ContainerStarted","Data":"ea74e462abd3faf0f2a7cd3d92414ee4cd2c21fa56f5af8fa50de2a174de38c4"} Dec 05 23:38:54 crc kubenswrapper[4734]: I1205 23:38:54.250879 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-gxrch" Dec 05 23:38:54 crc kubenswrapper[4734]: I1205 23:38:54.363084 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-x9hwd"] Dec 05 23:38:54 crc kubenswrapper[4734]: I1205 23:38:54.363426 4734 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58dd9ff6bc-x9hwd" podUID="a2261d63-5689-409a-8395-c652e5c2960e" containerName="dnsmasq-dns" containerID="cri-o://d8398c8e8e80acadc385209acc0873aba343061e9edc9f33a2e8dcc5787ccc78" gracePeriod=10 Dec 05 23:38:54 crc kubenswrapper[4734]: I1205 23:38:54.529938 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"05a4de5c-b10c-4d66-b3dd-0468357229b0","Type":"ContainerStarted","Data":"caa15c8be00b0cd026dbf85abad91cee75f3e43c982a4959a482b572e2817c84"} Dec 05 23:38:54 crc kubenswrapper[4734]: I1205 23:38:54.532104 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-85fbcb99c8-4gdvt" event={"ID":"ccbbcfb6-1ffd-4c8e-8945-9d496467e46a","Type":"ContainerStarted","Data":"65a51091ef17309ae4f801779a8c846e37d1fa11c40b3344214735816bfadd06"} Dec 05 23:38:54 crc kubenswrapper[4734]: I1205 23:38:54.534555 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"af968080-3e37-4034-90a7-0b654e68ee89","Type":"ContainerStarted","Data":"880b3909e293946d8333fcbf933f98a0d14d6594cac4b9529e4bbc96550c5578"} Dec 05 23:38:54 crc kubenswrapper[4734]: I1205 23:38:54.534608 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-fffd48d8f-srcmr" Dec 05 23:38:54 crc kubenswrapper[4734]: I1205 23:38:54.560677 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-fffd48d8f-srcmr" podStartSLOduration=6.560644705 podStartE2EDuration="6.560644705s" podCreationTimestamp="2025-12-05 23:38:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:38:54.554797704 +0000 UTC m=+1155.238201990" watchObservedRunningTime="2025-12-05 23:38:54.560644705 +0000 UTC m=+1155.244048981" Dec 05 23:38:55 crc 
kubenswrapper[4734]: I1205 23:38:55.548130 4734 generic.go:334] "Generic (PLEG): container finished" podID="a2261d63-5689-409a-8395-c652e5c2960e" containerID="d8398c8e8e80acadc385209acc0873aba343061e9edc9f33a2e8dcc5787ccc78" exitCode=0 Dec 05 23:38:55 crc kubenswrapper[4734]: I1205 23:38:55.548237 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-x9hwd" event={"ID":"a2261d63-5689-409a-8395-c652e5c2960e","Type":"ContainerDied","Data":"d8398c8e8e80acadc385209acc0873aba343061e9edc9f33a2e8dcc5787ccc78"} Dec 05 23:38:55 crc kubenswrapper[4734]: I1205 23:38:55.549119 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-85fbcb99c8-4gdvt" Dec 05 23:38:55 crc kubenswrapper[4734]: I1205 23:38:55.549183 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-85fbcb99c8-4gdvt" Dec 05 23:38:55 crc kubenswrapper[4734]: I1205 23:38:55.590585 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-85fbcb99c8-4gdvt" podStartSLOduration=7.590557259 podStartE2EDuration="7.590557259s" podCreationTimestamp="2025-12-05 23:38:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:38:55.575492774 +0000 UTC m=+1156.258897050" watchObservedRunningTime="2025-12-05 23:38:55.590557259 +0000 UTC m=+1156.273961535" Dec 05 23:38:55 crc kubenswrapper[4734]: I1205 23:38:55.971309 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-x9hwd" Dec 05 23:38:56 crc kubenswrapper[4734]: I1205 23:38:56.109767 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2261d63-5689-409a-8395-c652e5c2960e-ovsdbserver-nb\") pod \"a2261d63-5689-409a-8395-c652e5c2960e\" (UID: \"a2261d63-5689-409a-8395-c652e5c2960e\") " Dec 05 23:38:56 crc kubenswrapper[4734]: I1205 23:38:56.109847 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2261d63-5689-409a-8395-c652e5c2960e-ovsdbserver-sb\") pod \"a2261d63-5689-409a-8395-c652e5c2960e\" (UID: \"a2261d63-5689-409a-8395-c652e5c2960e\") " Dec 05 23:38:56 crc kubenswrapper[4734]: I1205 23:38:56.114629 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x55bq\" (UniqueName: \"kubernetes.io/projected/a2261d63-5689-409a-8395-c652e5c2960e-kube-api-access-x55bq\") pod \"a2261d63-5689-409a-8395-c652e5c2960e\" (UID: \"a2261d63-5689-409a-8395-c652e5c2960e\") " Dec 05 23:38:56 crc kubenswrapper[4734]: I1205 23:38:56.114669 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2261d63-5689-409a-8395-c652e5c2960e-dns-svc\") pod \"a2261d63-5689-409a-8395-c652e5c2960e\" (UID: \"a2261d63-5689-409a-8395-c652e5c2960e\") " Dec 05 23:38:56 crc kubenswrapper[4734]: I1205 23:38:56.114833 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a2261d63-5689-409a-8395-c652e5c2960e-dns-swift-storage-0\") pod \"a2261d63-5689-409a-8395-c652e5c2960e\" (UID: \"a2261d63-5689-409a-8395-c652e5c2960e\") " Dec 05 23:38:56 crc kubenswrapper[4734]: I1205 23:38:56.114923 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/a2261d63-5689-409a-8395-c652e5c2960e-config\") pod \"a2261d63-5689-409a-8395-c652e5c2960e\" (UID: \"a2261d63-5689-409a-8395-c652e5c2960e\") " Dec 05 23:38:56 crc kubenswrapper[4734]: I1205 23:38:56.127789 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2261d63-5689-409a-8395-c652e5c2960e-kube-api-access-x55bq" (OuterVolumeSpecName: "kube-api-access-x55bq") pod "a2261d63-5689-409a-8395-c652e5c2960e" (UID: "a2261d63-5689-409a-8395-c652e5c2960e"). InnerVolumeSpecName "kube-api-access-x55bq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:38:56 crc kubenswrapper[4734]: I1205 23:38:56.208184 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2261d63-5689-409a-8395-c652e5c2960e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a2261d63-5689-409a-8395-c652e5c2960e" (UID: "a2261d63-5689-409a-8395-c652e5c2960e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:38:56 crc kubenswrapper[4734]: I1205 23:38:56.219933 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x55bq\" (UniqueName: \"kubernetes.io/projected/a2261d63-5689-409a-8395-c652e5c2960e-kube-api-access-x55bq\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:56 crc kubenswrapper[4734]: I1205 23:38:56.219983 4734 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2261d63-5689-409a-8395-c652e5c2960e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:56 crc kubenswrapper[4734]: I1205 23:38:56.249692 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2261d63-5689-409a-8395-c652e5c2960e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a2261d63-5689-409a-8395-c652e5c2960e" (UID: "a2261d63-5689-409a-8395-c652e5c2960e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:38:56 crc kubenswrapper[4734]: I1205 23:38:56.251493 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2261d63-5689-409a-8395-c652e5c2960e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a2261d63-5689-409a-8395-c652e5c2960e" (UID: "a2261d63-5689-409a-8395-c652e5c2960e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:38:56 crc kubenswrapper[4734]: I1205 23:38:56.253766 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2261d63-5689-409a-8395-c652e5c2960e-config" (OuterVolumeSpecName: "config") pod "a2261d63-5689-409a-8395-c652e5c2960e" (UID: "a2261d63-5689-409a-8395-c652e5c2960e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:38:56 crc kubenswrapper[4734]: I1205 23:38:56.258116 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2261d63-5689-409a-8395-c652e5c2960e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a2261d63-5689-409a-8395-c652e5c2960e" (UID: "a2261d63-5689-409a-8395-c652e5c2960e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:38:56 crc kubenswrapper[4734]: I1205 23:38:56.322373 4734 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2261d63-5689-409a-8395-c652e5c2960e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:56 crc kubenswrapper[4734]: I1205 23:38:56.322415 4734 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2261d63-5689-409a-8395-c652e5c2960e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:56 crc kubenswrapper[4734]: I1205 23:38:56.322427 4734 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a2261d63-5689-409a-8395-c652e5c2960e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:56 crc kubenswrapper[4734]: I1205 23:38:56.322443 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2261d63-5689-409a-8395-c652e5c2960e-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:38:56 crc kubenswrapper[4734]: I1205 23:38:56.565371 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-x9hwd" event={"ID":"a2261d63-5689-409a-8395-c652e5c2960e","Type":"ContainerDied","Data":"378d6dbac9bf0e69575ee3921a9f752a6216057b4d6de30acef4fcef53feae46"} Dec 05 23:38:56 crc kubenswrapper[4734]: I1205 23:38:56.565930 4734 scope.go:117] "RemoveContainer" containerID="d8398c8e8e80acadc385209acc0873aba343061e9edc9f33a2e8dcc5787ccc78" Dec 05 23:38:56 crc kubenswrapper[4734]: I1205 23:38:56.565512 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-x9hwd" Dec 05 23:38:56 crc kubenswrapper[4734]: I1205 23:38:56.568316 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"af968080-3e37-4034-90a7-0b654e68ee89","Type":"ContainerStarted","Data":"1b47e509e05bda36df057f4e42fdaa560d8dbd474320378bd221a11489d58c65"} Dec 05 23:38:56 crc kubenswrapper[4734]: I1205 23:38:56.611674 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.611653239 podStartE2EDuration="8.611653239s" podCreationTimestamp="2025-12-05 23:38:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:38:56.596307687 +0000 UTC m=+1157.279711983" watchObservedRunningTime="2025-12-05 23:38:56.611653239 +0000 UTC m=+1157.295057515" Dec 05 23:38:56 crc kubenswrapper[4734]: I1205 23:38:56.625215 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-x9hwd"] Dec 05 23:38:56 crc kubenswrapper[4734]: I1205 23:38:56.633390 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-x9hwd"] Dec 05 23:38:57 crc kubenswrapper[4734]: I1205 23:38:57.583749 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xsvx9" event={"ID":"4c26d17f-e341-41c5-9759-c0b265fcceea","Type":"ContainerStarted","Data":"1b18a6a4b18d08501789d9e21b925316c413cdbe7a4ed004b5fef4a11dfd69d1"} Dec 05 23:38:57 crc kubenswrapper[4734]: I1205 23:38:57.587378 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"05a4de5c-b10c-4d66-b3dd-0468357229b0","Type":"ContainerStarted","Data":"6fb7398bf31d93c15ed3a4e6788714ab58c7c73669d5c0a4145e85018ce10e05"} Dec 05 23:38:57 crc kubenswrapper[4734]: I1205 23:38:57.610132 4734 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-xsvx9" podStartSLOduration=4.221271008 podStartE2EDuration="55.610104519s" podCreationTimestamp="2025-12-05 23:38:02 +0000 UTC" firstStartedPulling="2025-12-05 23:38:04.193062538 +0000 UTC m=+1104.876466814" lastFinishedPulling="2025-12-05 23:38:55.581896039 +0000 UTC m=+1156.265300325" observedRunningTime="2025-12-05 23:38:57.606589025 +0000 UTC m=+1158.289993301" watchObservedRunningTime="2025-12-05 23:38:57.610104519 +0000 UTC m=+1158.293508795" Dec 05 23:38:57 crc kubenswrapper[4734]: I1205 23:38:57.638306 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2261d63-5689-409a-8395-c652e5c2960e" path="/var/lib/kubelet/pods/a2261d63-5689-409a-8395-c652e5c2960e/volumes" Dec 05 23:38:57 crc kubenswrapper[4734]: I1205 23:38:57.640666 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.64064763 podStartE2EDuration="8.64064763s" podCreationTimestamp="2025-12-05 23:38:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:38:57.634820299 +0000 UTC m=+1158.318224575" watchObservedRunningTime="2025-12-05 23:38:57.64064763 +0000 UTC m=+1158.324051906" Dec 05 23:38:58 crc kubenswrapper[4734]: I1205 23:38:58.021040 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-85fbcb99c8-4gdvt" Dec 05 23:38:58 crc kubenswrapper[4734]: E1205 23:38:58.363324 4734 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21a9837d_cc1d_4bf1_9b52_f9196880e367.slice/crio-e3a126dc953224f7cf69d43ba28034a6b6151fba211d245d8ef374996a56a5fa.scope\": RecentStats: unable to find data in memory cache]" Dec 05 23:38:59 crc 
kubenswrapper[4734]: I1205 23:38:59.238859 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 23:38:59 crc kubenswrapper[4734]: I1205 23:38:59.239477 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 23:38:59 crc kubenswrapper[4734]: I1205 23:38:59.280247 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 23:38:59 crc kubenswrapper[4734]: I1205 23:38:59.315227 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 23:38:59 crc kubenswrapper[4734]: I1205 23:38:59.611313 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 23:38:59 crc kubenswrapper[4734]: I1205 23:38:59.611386 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 23:39:00 crc kubenswrapper[4734]: I1205 23:39:00.101066 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 23:39:00 crc kubenswrapper[4734]: I1205 23:39:00.101130 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 23:39:00 crc kubenswrapper[4734]: I1205 23:39:00.146497 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 23:39:00 crc kubenswrapper[4734]: I1205 23:39:00.153053 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 23:39:00 crc kubenswrapper[4734]: I1205 23:39:00.625047 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 
23:39:00 crc kubenswrapper[4734]: I1205 23:39:00.625653 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 23:39:01 crc kubenswrapper[4734]: I1205 23:39:01.899105 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5d469948dd-n7t4x" podUID="c96cd173-4707-4edc-a92e-35db297082e2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Dec 05 23:39:02 crc kubenswrapper[4734]: I1205 23:39:02.007915 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-755fc898d8-dlnbz" podUID="bbcbbde9-55c9-48dc-866d-ab670775e9b3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused" Dec 05 23:39:02 crc kubenswrapper[4734]: I1205 23:39:02.085131 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 23:39:02 crc kubenswrapper[4734]: I1205 23:39:02.085273 4734 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 23:39:02 crc kubenswrapper[4734]: I1205 23:39:02.087349 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 23:39:03 crc kubenswrapper[4734]: I1205 23:39:03.977339 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 23:39:03 crc kubenswrapper[4734]: I1205 23:39:03.978032 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 23:39:05 crc kubenswrapper[4734]: I1205 23:39:05.708157 4734 generic.go:334] "Generic (PLEG): container finished" podID="d71f9558-c417-4cc7-934f-258f388cced2" 
containerID="ad3d670f413f88dcf8202b5fcb6c9d218e25ec1a2ba6fa4ebad056845b56179f" exitCode=0 Dec 05 23:39:05 crc kubenswrapper[4734]: I1205 23:39:05.708265 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-f82cb" event={"ID":"d71f9558-c417-4cc7-934f-258f388cced2","Type":"ContainerDied","Data":"ad3d670f413f88dcf8202b5fcb6c9d218e25ec1a2ba6fa4ebad056845b56179f"} Dec 05 23:39:06 crc kubenswrapper[4734]: I1205 23:39:06.151939 4734 scope.go:117] "RemoveContainer" containerID="d7c956a71ae0f76861b7425ac025c57aae06ff8d8157384091bb5e0ba0e2f96e" Dec 05 23:39:06 crc kubenswrapper[4734]: E1205 23:39:06.169450 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24:latest" Dec 05 23:39:06 crc kubenswrapper[4734]: E1205 23:39:06.170090 4734 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24:latest,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,Moun
tPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qs8cz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(067f11aa-41d5-4a34-9f2e-33b35981e9ba): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 23:39:06 crc kubenswrapper[4734]: E1205 23:39:06.171715 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: 
context canceled\", failed to \"StartContainer\" for \"ceilometer-notification-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"]" pod="openstack/ceilometer-0" podUID="067f11aa-41d5-4a34-9f2e-33b35981e9ba" Dec 05 23:39:06 crc kubenswrapper[4734]: I1205 23:39:06.731154 4734 generic.go:334] "Generic (PLEG): container finished" podID="5f9ab2cc-aaf2-46c4-b03b-c70d220732cb" containerID="1e9bd1d7fbce33ca96e48e6f2079fdd966a72795b3f096b43fc9a34d5663856c" exitCode=137 Dec 05 23:39:06 crc kubenswrapper[4734]: I1205 23:39:06.731193 4734 generic.go:334] "Generic (PLEG): container finished" podID="5f9ab2cc-aaf2-46c4-b03b-c70d220732cb" containerID="81aae8f5faee4191e15bf0466c0ce55e2137b33ca675df174b2ad82865c93da9" exitCode=137 Dec 05 23:39:06 crc kubenswrapper[4734]: I1205 23:39:06.731369 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="067f11aa-41d5-4a34-9f2e-33b35981e9ba" containerName="sg-core" containerID="cri-o://cd25d5410428568ea4619f8ae105032aee9215147914ec8375ef8d26a79e3039" gracePeriod=30 Dec 05 23:39:06 crc kubenswrapper[4734]: I1205 23:39:06.731991 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c55dfd787-8xfc2" event={"ID":"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb","Type":"ContainerDied","Data":"1e9bd1d7fbce33ca96e48e6f2079fdd966a72795b3f096b43fc9a34d5663856c"} Dec 05 23:39:06 crc kubenswrapper[4734]: I1205 23:39:06.732088 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c55dfd787-8xfc2" event={"ID":"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb","Type":"ContainerDied","Data":"81aae8f5faee4191e15bf0466c0ce55e2137b33ca675df174b2ad82865c93da9"} Dec 05 23:39:06 crc kubenswrapper[4734]: I1205 23:39:06.973294 4734 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-c55dfd787-8xfc2" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.061593 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-f82cb" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.079983 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-config-data\") pod \"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb\" (UID: \"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb\") " Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.080097 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-scripts\") pod \"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb\" (UID: \"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb\") " Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.080313 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-horizon-secret-key\") pod \"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb\" (UID: \"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb\") " Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.080425 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-logs\") pod \"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb\" (UID: \"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb\") " Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.080474 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vss5g\" (UniqueName: \"kubernetes.io/projected/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-kube-api-access-vss5g\") pod \"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb\" (UID: 
\"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb\") " Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.088500 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-logs" (OuterVolumeSpecName: "logs") pod "5f9ab2cc-aaf2-46c4-b03b-c70d220732cb" (UID: "5f9ab2cc-aaf2-46c4-b03b-c70d220732cb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.094690 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "5f9ab2cc-aaf2-46c4-b03b-c70d220732cb" (UID: "5f9ab2cc-aaf2-46c4-b03b-c70d220732cb"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.098945 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-kube-api-access-vss5g" (OuterVolumeSpecName: "kube-api-access-vss5g") pod "5f9ab2cc-aaf2-46c4-b03b-c70d220732cb" (UID: "5f9ab2cc-aaf2-46c4-b03b-c70d220732cb"). InnerVolumeSpecName "kube-api-access-vss5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:39:07 crc kubenswrapper[4734]: E1205 23:39:07.113750 4734 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-config-data podName:5f9ab2cc-aaf2-46c4-b03b-c70d220732cb nodeName:}" failed. No retries permitted until 2025-12-05 23:39:07.613706623 +0000 UTC m=+1168.297110899 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/configmap/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-config-data") pod "5f9ab2cc-aaf2-46c4-b03b-c70d220732cb" (UID: "5f9ab2cc-aaf2-46c4-b03b-c70d220732cb") : error deleting /var/lib/kubelet/pods/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb/volume-subpaths: remove /var/lib/kubelet/pods/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb/volume-subpaths: no such file or directory Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.114820 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-scripts" (OuterVolumeSpecName: "scripts") pod "5f9ab2cc-aaf2-46c4-b03b-c70d220732cb" (UID: "5f9ab2cc-aaf2-46c4-b03b-c70d220732cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.187609 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d71f9558-c417-4cc7-934f-258f388cced2-db-sync-config-data\") pod \"d71f9558-c417-4cc7-934f-258f388cced2\" (UID: \"d71f9558-c417-4cc7-934f-258f388cced2\") " Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.187741 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pv9bp\" (UniqueName: \"kubernetes.io/projected/d71f9558-c417-4cc7-934f-258f388cced2-kube-api-access-pv9bp\") pod \"d71f9558-c417-4cc7-934f-258f388cced2\" (UID: \"d71f9558-c417-4cc7-934f-258f388cced2\") " Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.187931 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d71f9558-c417-4cc7-934f-258f388cced2-combined-ca-bundle\") pod \"d71f9558-c417-4cc7-934f-258f388cced2\" (UID: \"d71f9558-c417-4cc7-934f-258f388cced2\") " Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 
23:39:07.188475 4734 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.188503 4734 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.188518 4734 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-logs\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.188535 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vss5g\" (UniqueName: \"kubernetes.io/projected/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-kube-api-access-vss5g\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.193768 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d71f9558-c417-4cc7-934f-258f388cced2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d71f9558-c417-4cc7-934f-258f388cced2" (UID: "d71f9558-c417-4cc7-934f-258f388cced2"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.198378 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d71f9558-c417-4cc7-934f-258f388cced2-kube-api-access-pv9bp" (OuterVolumeSpecName: "kube-api-access-pv9bp") pod "d71f9558-c417-4cc7-934f-258f388cced2" (UID: "d71f9558-c417-4cc7-934f-258f388cced2"). InnerVolumeSpecName "kube-api-access-pv9bp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.216133 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d71f9558-c417-4cc7-934f-258f388cced2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d71f9558-c417-4cc7-934f-258f388cced2" (UID: "d71f9558-c417-4cc7-934f-258f388cced2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.291147 4734 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d71f9558-c417-4cc7-934f-258f388cced2-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.291177 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pv9bp\" (UniqueName: \"kubernetes.io/projected/d71f9558-c417-4cc7-934f-258f388cced2-kube-api-access-pv9bp\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.291190 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d71f9558-c417-4cc7-934f-258f388cced2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.336988 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.494927 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/067f11aa-41d5-4a34-9f2e-33b35981e9ba-run-httpd\") pod \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\" (UID: \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\") " Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.494984 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/067f11aa-41d5-4a34-9f2e-33b35981e9ba-combined-ca-bundle\") pod \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\" (UID: \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\") " Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.495019 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/067f11aa-41d5-4a34-9f2e-33b35981e9ba-log-httpd\") pod \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\" (UID: \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\") " Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.495139 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/067f11aa-41d5-4a34-9f2e-33b35981e9ba-sg-core-conf-yaml\") pod \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\" (UID: \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\") " Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.495187 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/067f11aa-41d5-4a34-9f2e-33b35981e9ba-scripts\") pod \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\" (UID: \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\") " Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.495347 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs8cz\" (UniqueName: 
\"kubernetes.io/projected/067f11aa-41d5-4a34-9f2e-33b35981e9ba-kube-api-access-qs8cz\") pod \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\" (UID: \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\") " Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.495394 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/067f11aa-41d5-4a34-9f2e-33b35981e9ba-config-data\") pod \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\" (UID: \"067f11aa-41d5-4a34-9f2e-33b35981e9ba\") " Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.495409 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/067f11aa-41d5-4a34-9f2e-33b35981e9ba-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "067f11aa-41d5-4a34-9f2e-33b35981e9ba" (UID: "067f11aa-41d5-4a34-9f2e-33b35981e9ba"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.495548 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/067f11aa-41d5-4a34-9f2e-33b35981e9ba-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "067f11aa-41d5-4a34-9f2e-33b35981e9ba" (UID: "067f11aa-41d5-4a34-9f2e-33b35981e9ba"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.495903 4734 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/067f11aa-41d5-4a34-9f2e-33b35981e9ba-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.495926 4734 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/067f11aa-41d5-4a34-9f2e-33b35981e9ba-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.499788 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/067f11aa-41d5-4a34-9f2e-33b35981e9ba-kube-api-access-qs8cz" (OuterVolumeSpecName: "kube-api-access-qs8cz") pod "067f11aa-41d5-4a34-9f2e-33b35981e9ba" (UID: "067f11aa-41d5-4a34-9f2e-33b35981e9ba"). InnerVolumeSpecName "kube-api-access-qs8cz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.501822 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/067f11aa-41d5-4a34-9f2e-33b35981e9ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "067f11aa-41d5-4a34-9f2e-33b35981e9ba" (UID: "067f11aa-41d5-4a34-9f2e-33b35981e9ba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.502090 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/067f11aa-41d5-4a34-9f2e-33b35981e9ba-config-data" (OuterVolumeSpecName: "config-data") pod "067f11aa-41d5-4a34-9f2e-33b35981e9ba" (UID: "067f11aa-41d5-4a34-9f2e-33b35981e9ba"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.503468 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/067f11aa-41d5-4a34-9f2e-33b35981e9ba-scripts" (OuterVolumeSpecName: "scripts") pod "067f11aa-41d5-4a34-9f2e-33b35981e9ba" (UID: "067f11aa-41d5-4a34-9f2e-33b35981e9ba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.528623 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/067f11aa-41d5-4a34-9f2e-33b35981e9ba-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "067f11aa-41d5-4a34-9f2e-33b35981e9ba" (UID: "067f11aa-41d5-4a34-9f2e-33b35981e9ba"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.598024 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs8cz\" (UniqueName: \"kubernetes.io/projected/067f11aa-41d5-4a34-9f2e-33b35981e9ba-kube-api-access-qs8cz\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.598112 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/067f11aa-41d5-4a34-9f2e-33b35981e9ba-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.598131 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/067f11aa-41d5-4a34-9f2e-33b35981e9ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.598142 4734 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/067f11aa-41d5-4a34-9f2e-33b35981e9ba-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" 
Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.598154 4734 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/067f11aa-41d5-4a34-9f2e-33b35981e9ba-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.699492 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-config-data\") pod \"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb\" (UID: \"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb\") " Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.700601 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-config-data" (OuterVolumeSpecName: "config-data") pod "5f9ab2cc-aaf2-46c4-b03b-c70d220732cb" (UID: "5f9ab2cc-aaf2-46c4-b03b-c70d220732cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.701051 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.743647 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c55dfd787-8xfc2" event={"ID":"5f9ab2cc-aaf2-46c4-b03b-c70d220732cb","Type":"ContainerDied","Data":"50f36b899675afeb3c21ab167df5e282d91da6260923c9a33928cfd04b0d501d"} Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.743782 4734 scope.go:117] "RemoveContainer" containerID="1e9bd1d7fbce33ca96e48e6f2079fdd966a72795b3f096b43fc9a34d5663856c" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.743914 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-c55dfd787-8xfc2" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.747916 4734 generic.go:334] "Generic (PLEG): container finished" podID="067f11aa-41d5-4a34-9f2e-33b35981e9ba" containerID="cd25d5410428568ea4619f8ae105032aee9215147914ec8375ef8d26a79e3039" exitCode=2 Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.748010 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"067f11aa-41d5-4a34-9f2e-33b35981e9ba","Type":"ContainerDied","Data":"cd25d5410428568ea4619f8ae105032aee9215147914ec8375ef8d26a79e3039"} Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.748063 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"067f11aa-41d5-4a34-9f2e-33b35981e9ba","Type":"ContainerDied","Data":"702d6a05c4f6e9c8105d0d8906c578fd4fc0499f2201ec6262e0ec44d5713d78"} Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.748076 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.757666 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-f82cb" event={"ID":"d71f9558-c417-4cc7-934f-258f388cced2","Type":"ContainerDied","Data":"97eb2b3c326f18eb0c102d9aa55d35921ea87086744685d6726ecadc78607af8"} Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.757725 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97eb2b3c326f18eb0c102d9aa55d35921ea87086744685d6726ecadc78607af8" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.757754 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-f82cb" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.823766 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.846990 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.865602 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 23:39:07 crc kubenswrapper[4734]: E1205 23:39:07.866226 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="067f11aa-41d5-4a34-9f2e-33b35981e9ba" containerName="sg-core" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.866257 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="067f11aa-41d5-4a34-9f2e-33b35981e9ba" containerName="sg-core" Dec 05 23:39:07 crc kubenswrapper[4734]: E1205 23:39:07.866300 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f9ab2cc-aaf2-46c4-b03b-c70d220732cb" containerName="horizon-log" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.866309 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f9ab2cc-aaf2-46c4-b03b-c70d220732cb" containerName="horizon-log" Dec 05 23:39:07 crc kubenswrapper[4734]: E1205 23:39:07.866340 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d71f9558-c417-4cc7-934f-258f388cced2" containerName="barbican-db-sync" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.866349 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="d71f9558-c417-4cc7-934f-258f388cced2" containerName="barbican-db-sync" Dec 05 23:39:07 crc kubenswrapper[4734]: E1205 23:39:07.866361 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2261d63-5689-409a-8395-c652e5c2960e" containerName="dnsmasq-dns" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.866370 4734 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a2261d63-5689-409a-8395-c652e5c2960e" containerName="dnsmasq-dns" Dec 05 23:39:07 crc kubenswrapper[4734]: E1205 23:39:07.866402 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f9ab2cc-aaf2-46c4-b03b-c70d220732cb" containerName="horizon" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.866411 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f9ab2cc-aaf2-46c4-b03b-c70d220732cb" containerName="horizon" Dec 05 23:39:07 crc kubenswrapper[4734]: E1205 23:39:07.866424 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2261d63-5689-409a-8395-c652e5c2960e" containerName="init" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.866432 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2261d63-5689-409a-8395-c652e5c2960e" containerName="init" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.866694 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="067f11aa-41d5-4a34-9f2e-33b35981e9ba" containerName="sg-core" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.866730 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2261d63-5689-409a-8395-c652e5c2960e" containerName="dnsmasq-dns" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.866748 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="d71f9558-c417-4cc7-934f-258f388cced2" containerName="barbican-db-sync" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.866762 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f9ab2cc-aaf2-46c4-b03b-c70d220732cb" containerName="horizon-log" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.866775 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f9ab2cc-aaf2-46c4-b03b-c70d220732cb" containerName="horizon" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.869074 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.874978 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.875974 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.887190 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-c55dfd787-8xfc2"] Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.906874 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-c55dfd787-8xfc2"] Dec 05 23:39:07 crc kubenswrapper[4734]: I1205 23:39:07.935212 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:07.998863 4734 scope.go:117] "RemoveContainer" containerID="81aae8f5faee4191e15bf0466c0ce55e2137b33ca675df174b2ad82865c93da9" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.030139 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe03fe87-03d0-45aa-a054-5e991c765ccc-scripts\") pod \"ceilometer-0\" (UID: \"fe03fe87-03d0-45aa-a054-5e991c765ccc\") " pod="openstack/ceilometer-0" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.030321 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe03fe87-03d0-45aa-a054-5e991c765ccc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe03fe87-03d0-45aa-a054-5e991c765ccc\") " pod="openstack/ceilometer-0" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.030366 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fe03fe87-03d0-45aa-a054-5e991c765ccc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe03fe87-03d0-45aa-a054-5e991c765ccc\") " pod="openstack/ceilometer-0" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.030398 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq4t4\" (UniqueName: \"kubernetes.io/projected/fe03fe87-03d0-45aa-a054-5e991c765ccc-kube-api-access-rq4t4\") pod \"ceilometer-0\" (UID: \"fe03fe87-03d0-45aa-a054-5e991c765ccc\") " pod="openstack/ceilometer-0" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.030422 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe03fe87-03d0-45aa-a054-5e991c765ccc-run-httpd\") pod \"ceilometer-0\" (UID: \"fe03fe87-03d0-45aa-a054-5e991c765ccc\") " pod="openstack/ceilometer-0" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.030476 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe03fe87-03d0-45aa-a054-5e991c765ccc-config-data\") pod \"ceilometer-0\" (UID: \"fe03fe87-03d0-45aa-a054-5e991c765ccc\") " pod="openstack/ceilometer-0" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.030507 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe03fe87-03d0-45aa-a054-5e991c765ccc-log-httpd\") pod \"ceilometer-0\" (UID: \"fe03fe87-03d0-45aa-a054-5e991c765ccc\") " pod="openstack/ceilometer-0" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.041990 4734 scope.go:117] "RemoveContainer" containerID="cd25d5410428568ea4619f8ae105032aee9215147914ec8375ef8d26a79e3039" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.106469 4734 scope.go:117] "RemoveContainer" 
containerID="cd25d5410428568ea4619f8ae105032aee9215147914ec8375ef8d26a79e3039" Dec 05 23:39:08 crc kubenswrapper[4734]: E1205 23:39:08.108866 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd25d5410428568ea4619f8ae105032aee9215147914ec8375ef8d26a79e3039\": container with ID starting with cd25d5410428568ea4619f8ae105032aee9215147914ec8375ef8d26a79e3039 not found: ID does not exist" containerID="cd25d5410428568ea4619f8ae105032aee9215147914ec8375ef8d26a79e3039" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.108932 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd25d5410428568ea4619f8ae105032aee9215147914ec8375ef8d26a79e3039"} err="failed to get container status \"cd25d5410428568ea4619f8ae105032aee9215147914ec8375ef8d26a79e3039\": rpc error: code = NotFound desc = could not find container \"cd25d5410428568ea4619f8ae105032aee9215147914ec8375ef8d26a79e3039\": container with ID starting with cd25d5410428568ea4619f8ae105032aee9215147914ec8375ef8d26a79e3039 not found: ID does not exist" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.132261 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe03fe87-03d0-45aa-a054-5e991c765ccc-run-httpd\") pod \"ceilometer-0\" (UID: \"fe03fe87-03d0-45aa-a054-5e991c765ccc\") " pod="openstack/ceilometer-0" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.132341 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe03fe87-03d0-45aa-a054-5e991c765ccc-config-data\") pod \"ceilometer-0\" (UID: \"fe03fe87-03d0-45aa-a054-5e991c765ccc\") " pod="openstack/ceilometer-0" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.132366 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/fe03fe87-03d0-45aa-a054-5e991c765ccc-log-httpd\") pod \"ceilometer-0\" (UID: \"fe03fe87-03d0-45aa-a054-5e991c765ccc\") " pod="openstack/ceilometer-0" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.132411 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe03fe87-03d0-45aa-a054-5e991c765ccc-scripts\") pod \"ceilometer-0\" (UID: \"fe03fe87-03d0-45aa-a054-5e991c765ccc\") " pod="openstack/ceilometer-0" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.132494 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe03fe87-03d0-45aa-a054-5e991c765ccc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe03fe87-03d0-45aa-a054-5e991c765ccc\") " pod="openstack/ceilometer-0" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.132525 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe03fe87-03d0-45aa-a054-5e991c765ccc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe03fe87-03d0-45aa-a054-5e991c765ccc\") " pod="openstack/ceilometer-0" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.132573 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq4t4\" (UniqueName: \"kubernetes.io/projected/fe03fe87-03d0-45aa-a054-5e991c765ccc-kube-api-access-rq4t4\") pod \"ceilometer-0\" (UID: \"fe03fe87-03d0-45aa-a054-5e991c765ccc\") " pod="openstack/ceilometer-0" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.133718 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe03fe87-03d0-45aa-a054-5e991c765ccc-run-httpd\") pod \"ceilometer-0\" (UID: \"fe03fe87-03d0-45aa-a054-5e991c765ccc\") " pod="openstack/ceilometer-0" Dec 05 23:39:08 crc 
kubenswrapper[4734]: I1205 23:39:08.137654 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe03fe87-03d0-45aa-a054-5e991c765ccc-log-httpd\") pod \"ceilometer-0\" (UID: \"fe03fe87-03d0-45aa-a054-5e991c765ccc\") " pod="openstack/ceilometer-0" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.148212 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe03fe87-03d0-45aa-a054-5e991c765ccc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe03fe87-03d0-45aa-a054-5e991c765ccc\") " pod="openstack/ceilometer-0" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.153616 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe03fe87-03d0-45aa-a054-5e991c765ccc-scripts\") pod \"ceilometer-0\" (UID: \"fe03fe87-03d0-45aa-a054-5e991c765ccc\") " pod="openstack/ceilometer-0" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.154057 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe03fe87-03d0-45aa-a054-5e991c765ccc-config-data\") pod \"ceilometer-0\" (UID: \"fe03fe87-03d0-45aa-a054-5e991c765ccc\") " pod="openstack/ceilometer-0" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.157624 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-57c5555847-t5zf4"] Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.159930 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-57c5555847-t5zf4" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.165787 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe03fe87-03d0-45aa-a054-5e991c765ccc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe03fe87-03d0-45aa-a054-5e991c765ccc\") " pod="openstack/ceilometer-0" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.174305 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5d4f95c8c8-c5lws"] Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.176403 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5d4f95c8c8-c5lws" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.196329 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-db8zc" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.196681 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.196930 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.197161 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.227705 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-57c5555847-t5zf4"] Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.237393 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0266e747-392d-46c1-bc3e-0ef614db01e3-config-data-custom\") pod \"barbican-keystone-listener-5d4f95c8c8-c5lws\" 
(UID: \"0266e747-392d-46c1-bc3e-0ef614db01e3\") " pod="openstack/barbican-keystone-listener-5d4f95c8c8-c5lws" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.237457 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0266e747-392d-46c1-bc3e-0ef614db01e3-logs\") pod \"barbican-keystone-listener-5d4f95c8c8-c5lws\" (UID: \"0266e747-392d-46c1-bc3e-0ef614db01e3\") " pod="openstack/barbican-keystone-listener-5d4f95c8c8-c5lws" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.237477 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nkzh\" (UniqueName: \"kubernetes.io/projected/b35b4bd8-efbd-4f96-9962-490ea41d44d1-kube-api-access-7nkzh\") pod \"barbican-worker-57c5555847-t5zf4\" (UID: \"b35b4bd8-efbd-4f96-9962-490ea41d44d1\") " pod="openstack/barbican-worker-57c5555847-t5zf4" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.237501 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcc2b\" (UniqueName: \"kubernetes.io/projected/0266e747-392d-46c1-bc3e-0ef614db01e3-kube-api-access-lcc2b\") pod \"barbican-keystone-listener-5d4f95c8c8-c5lws\" (UID: \"0266e747-392d-46c1-bc3e-0ef614db01e3\") " pod="openstack/barbican-keystone-listener-5d4f95c8c8-c5lws" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.237527 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0266e747-392d-46c1-bc3e-0ef614db01e3-combined-ca-bundle\") pod \"barbican-keystone-listener-5d4f95c8c8-c5lws\" (UID: \"0266e747-392d-46c1-bc3e-0ef614db01e3\") " pod="openstack/barbican-keystone-listener-5d4f95c8c8-c5lws" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.237575 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b35b4bd8-efbd-4f96-9962-490ea41d44d1-config-data\") pod \"barbican-worker-57c5555847-t5zf4\" (UID: \"b35b4bd8-efbd-4f96-9962-490ea41d44d1\") " pod="openstack/barbican-worker-57c5555847-t5zf4" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.237598 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b35b4bd8-efbd-4f96-9962-490ea41d44d1-combined-ca-bundle\") pod \"barbican-worker-57c5555847-t5zf4\" (UID: \"b35b4bd8-efbd-4f96-9962-490ea41d44d1\") " pod="openstack/barbican-worker-57c5555847-t5zf4" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.237624 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b35b4bd8-efbd-4f96-9962-490ea41d44d1-config-data-custom\") pod \"barbican-worker-57c5555847-t5zf4\" (UID: \"b35b4bd8-efbd-4f96-9962-490ea41d44d1\") " pod="openstack/barbican-worker-57c5555847-t5zf4" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.237686 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0266e747-392d-46c1-bc3e-0ef614db01e3-config-data\") pod \"barbican-keystone-listener-5d4f95c8c8-c5lws\" (UID: \"0266e747-392d-46c1-bc3e-0ef614db01e3\") " pod="openstack/barbican-keystone-listener-5d4f95c8c8-c5lws" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.237718 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b35b4bd8-efbd-4f96-9962-490ea41d44d1-logs\") pod \"barbican-worker-57c5555847-t5zf4\" (UID: \"b35b4bd8-efbd-4f96-9962-490ea41d44d1\") " pod="openstack/barbican-worker-57c5555847-t5zf4" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.239612 4734 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq4t4\" (UniqueName: \"kubernetes.io/projected/fe03fe87-03d0-45aa-a054-5e991c765ccc-kube-api-access-rq4t4\") pod \"ceilometer-0\" (UID: \"fe03fe87-03d0-45aa-a054-5e991c765ccc\") " pod="openstack/ceilometer-0" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.253651 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5d4f95c8c8-c5lws"] Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.340878 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0266e747-392d-46c1-bc3e-0ef614db01e3-config-data\") pod \"barbican-keystone-listener-5d4f95c8c8-c5lws\" (UID: \"0266e747-392d-46c1-bc3e-0ef614db01e3\") " pod="openstack/barbican-keystone-listener-5d4f95c8c8-c5lws" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.340933 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b35b4bd8-efbd-4f96-9962-490ea41d44d1-logs\") pod \"barbican-worker-57c5555847-t5zf4\" (UID: \"b35b4bd8-efbd-4f96-9962-490ea41d44d1\") " pod="openstack/barbican-worker-57c5555847-t5zf4" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.340970 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0266e747-392d-46c1-bc3e-0ef614db01e3-config-data-custom\") pod \"barbican-keystone-listener-5d4f95c8c8-c5lws\" (UID: \"0266e747-392d-46c1-bc3e-0ef614db01e3\") " pod="openstack/barbican-keystone-listener-5d4f95c8c8-c5lws" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.341053 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0266e747-392d-46c1-bc3e-0ef614db01e3-logs\") pod \"barbican-keystone-listener-5d4f95c8c8-c5lws\" (UID: 
\"0266e747-392d-46c1-bc3e-0ef614db01e3\") " pod="openstack/barbican-keystone-listener-5d4f95c8c8-c5lws" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.341077 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nkzh\" (UniqueName: \"kubernetes.io/projected/b35b4bd8-efbd-4f96-9962-490ea41d44d1-kube-api-access-7nkzh\") pod \"barbican-worker-57c5555847-t5zf4\" (UID: \"b35b4bd8-efbd-4f96-9962-490ea41d44d1\") " pod="openstack/barbican-worker-57c5555847-t5zf4" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.341120 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcc2b\" (UniqueName: \"kubernetes.io/projected/0266e747-392d-46c1-bc3e-0ef614db01e3-kube-api-access-lcc2b\") pod \"barbican-keystone-listener-5d4f95c8c8-c5lws\" (UID: \"0266e747-392d-46c1-bc3e-0ef614db01e3\") " pod="openstack/barbican-keystone-listener-5d4f95c8c8-c5lws" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.345638 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0266e747-392d-46c1-bc3e-0ef614db01e3-combined-ca-bundle\") pod \"barbican-keystone-listener-5d4f95c8c8-c5lws\" (UID: \"0266e747-392d-46c1-bc3e-0ef614db01e3\") " pod="openstack/barbican-keystone-listener-5d4f95c8c8-c5lws" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.345713 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b35b4bd8-efbd-4f96-9962-490ea41d44d1-config-data\") pod \"barbican-worker-57c5555847-t5zf4\" (UID: \"b35b4bd8-efbd-4f96-9962-490ea41d44d1\") " pod="openstack/barbican-worker-57c5555847-t5zf4" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.345748 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b35b4bd8-efbd-4f96-9962-490ea41d44d1-combined-ca-bundle\") pod \"barbican-worker-57c5555847-t5zf4\" (UID: \"b35b4bd8-efbd-4f96-9962-490ea41d44d1\") " pod="openstack/barbican-worker-57c5555847-t5zf4" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.345806 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b35b4bd8-efbd-4f96-9962-490ea41d44d1-config-data-custom\") pod \"barbican-worker-57c5555847-t5zf4\" (UID: \"b35b4bd8-efbd-4f96-9962-490ea41d44d1\") " pod="openstack/barbican-worker-57c5555847-t5zf4" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.349226 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0266e747-392d-46c1-bc3e-0ef614db01e3-logs\") pod \"barbican-keystone-listener-5d4f95c8c8-c5lws\" (UID: \"0266e747-392d-46c1-bc3e-0ef614db01e3\") " pod="openstack/barbican-keystone-listener-5d4f95c8c8-c5lws" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.349817 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b35b4bd8-efbd-4f96-9962-490ea41d44d1-logs\") pod \"barbican-worker-57c5555847-t5zf4\" (UID: \"b35b4bd8-efbd-4f96-9962-490ea41d44d1\") " pod="openstack/barbican-worker-57c5555847-t5zf4" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.366441 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b35b4bd8-efbd-4f96-9962-490ea41d44d1-config-data\") pod \"barbican-worker-57c5555847-t5zf4\" (UID: \"b35b4bd8-efbd-4f96-9962-490ea41d44d1\") " pod="openstack/barbican-worker-57c5555847-t5zf4" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.367247 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-7ns6v"] Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.368130 4734 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b35b4bd8-efbd-4f96-9962-490ea41d44d1-config-data-custom\") pod \"barbican-worker-57c5555847-t5zf4\" (UID: \"b35b4bd8-efbd-4f96-9962-490ea41d44d1\") " pod="openstack/barbican-worker-57c5555847-t5zf4" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.369468 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0266e747-392d-46c1-bc3e-0ef614db01e3-combined-ca-bundle\") pod \"barbican-keystone-listener-5d4f95c8c8-c5lws\" (UID: \"0266e747-392d-46c1-bc3e-0ef614db01e3\") " pod="openstack/barbican-keystone-listener-5d4f95c8c8-c5lws" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.369928 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0266e747-392d-46c1-bc3e-0ef614db01e3-config-data-custom\") pod \"barbican-keystone-listener-5d4f95c8c8-c5lws\" (UID: \"0266e747-392d-46c1-bc3e-0ef614db01e3\") " pod="openstack/barbican-keystone-listener-5d4f95c8c8-c5lws" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.374575 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-7ns6v" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.386439 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b35b4bd8-efbd-4f96-9962-490ea41d44d1-combined-ca-bundle\") pod \"barbican-worker-57c5555847-t5zf4\" (UID: \"b35b4bd8-efbd-4f96-9962-490ea41d44d1\") " pod="openstack/barbican-worker-57c5555847-t5zf4" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.387017 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0266e747-392d-46c1-bc3e-0ef614db01e3-config-data\") pod \"barbican-keystone-listener-5d4f95c8c8-c5lws\" (UID: \"0266e747-392d-46c1-bc3e-0ef614db01e3\") " pod="openstack/barbican-keystone-listener-5d4f95c8c8-c5lws" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.402569 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nkzh\" (UniqueName: \"kubernetes.io/projected/b35b4bd8-efbd-4f96-9962-490ea41d44d1-kube-api-access-7nkzh\") pod \"barbican-worker-57c5555847-t5zf4\" (UID: \"b35b4bd8-efbd-4f96-9962-490ea41d44d1\") " pod="openstack/barbican-worker-57c5555847-t5zf4" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.429536 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcc2b\" (UniqueName: \"kubernetes.io/projected/0266e747-392d-46c1-bc3e-0ef614db01e3-kube-api-access-lcc2b\") pod \"barbican-keystone-listener-5d4f95c8c8-c5lws\" (UID: \"0266e747-392d-46c1-bc3e-0ef614db01e3\") " pod="openstack/barbican-keystone-listener-5d4f95c8c8-c5lws" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.455339 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-7ns6v"] Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.461641 4734 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0682ffd5-84fb-4e36-8386-f65aa88b6184-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-7ns6v\" (UID: \"0682ffd5-84fb-4e36-8386-f65aa88b6184\") " pod="openstack/dnsmasq-dns-85ff748b95-7ns6v" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.461774 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0682ffd5-84fb-4e36-8386-f65aa88b6184-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-7ns6v\" (UID: \"0682ffd5-84fb-4e36-8386-f65aa88b6184\") " pod="openstack/dnsmasq-dns-85ff748b95-7ns6v" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.461805 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0682ffd5-84fb-4e36-8386-f65aa88b6184-dns-svc\") pod \"dnsmasq-dns-85ff748b95-7ns6v\" (UID: \"0682ffd5-84fb-4e36-8386-f65aa88b6184\") " pod="openstack/dnsmasq-dns-85ff748b95-7ns6v" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.461929 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z94j5\" (UniqueName: \"kubernetes.io/projected/0682ffd5-84fb-4e36-8386-f65aa88b6184-kube-api-access-z94j5\") pod \"dnsmasq-dns-85ff748b95-7ns6v\" (UID: \"0682ffd5-84fb-4e36-8386-f65aa88b6184\") " pod="openstack/dnsmasq-dns-85ff748b95-7ns6v" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.462028 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0682ffd5-84fb-4e36-8386-f65aa88b6184-config\") pod \"dnsmasq-dns-85ff748b95-7ns6v\" (UID: \"0682ffd5-84fb-4e36-8386-f65aa88b6184\") " pod="openstack/dnsmasq-dns-85ff748b95-7ns6v" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 
23:39:08.462377 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0682ffd5-84fb-4e36-8386-f65aa88b6184-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-7ns6v\" (UID: \"0682ffd5-84fb-4e36-8386-f65aa88b6184\") " pod="openstack/dnsmasq-dns-85ff748b95-7ns6v" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.515930 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.563672 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-75b7946cc8-hzcp7"] Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.565624 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-75b7946cc8-hzcp7" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.566660 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0682ffd5-84fb-4e36-8386-f65aa88b6184-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-7ns6v\" (UID: \"0682ffd5-84fb-4e36-8386-f65aa88b6184\") " pod="openstack/dnsmasq-dns-85ff748b95-7ns6v" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.566753 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0682ffd5-84fb-4e36-8386-f65aa88b6184-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-7ns6v\" (UID: \"0682ffd5-84fb-4e36-8386-f65aa88b6184\") " pod="openstack/dnsmasq-dns-85ff748b95-7ns6v" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.566781 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0682ffd5-84fb-4e36-8386-f65aa88b6184-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-7ns6v\" (UID: 
\"0682ffd5-84fb-4e36-8386-f65aa88b6184\") " pod="openstack/dnsmasq-dns-85ff748b95-7ns6v" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.566804 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0682ffd5-84fb-4e36-8386-f65aa88b6184-dns-svc\") pod \"dnsmasq-dns-85ff748b95-7ns6v\" (UID: \"0682ffd5-84fb-4e36-8386-f65aa88b6184\") " pod="openstack/dnsmasq-dns-85ff748b95-7ns6v" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.566836 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z94j5\" (UniqueName: \"kubernetes.io/projected/0682ffd5-84fb-4e36-8386-f65aa88b6184-kube-api-access-z94j5\") pod \"dnsmasq-dns-85ff748b95-7ns6v\" (UID: \"0682ffd5-84fb-4e36-8386-f65aa88b6184\") " pod="openstack/dnsmasq-dns-85ff748b95-7ns6v" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.566865 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0682ffd5-84fb-4e36-8386-f65aa88b6184-config\") pod \"dnsmasq-dns-85ff748b95-7ns6v\" (UID: \"0682ffd5-84fb-4e36-8386-f65aa88b6184\") " pod="openstack/dnsmasq-dns-85ff748b95-7ns6v" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.567772 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0682ffd5-84fb-4e36-8386-f65aa88b6184-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-7ns6v\" (UID: \"0682ffd5-84fb-4e36-8386-f65aa88b6184\") " pod="openstack/dnsmasq-dns-85ff748b95-7ns6v" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.569044 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0682ffd5-84fb-4e36-8386-f65aa88b6184-config\") pod \"dnsmasq-dns-85ff748b95-7ns6v\" (UID: \"0682ffd5-84fb-4e36-8386-f65aa88b6184\") " pod="openstack/dnsmasq-dns-85ff748b95-7ns6v" Dec 05 
23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.572218 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0682ffd5-84fb-4e36-8386-f65aa88b6184-dns-svc\") pod \"dnsmasq-dns-85ff748b95-7ns6v\" (UID: \"0682ffd5-84fb-4e36-8386-f65aa88b6184\") " pod="openstack/dnsmasq-dns-85ff748b95-7ns6v" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.574149 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0682ffd5-84fb-4e36-8386-f65aa88b6184-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-7ns6v\" (UID: \"0682ffd5-84fb-4e36-8386-f65aa88b6184\") " pod="openstack/dnsmasq-dns-85ff748b95-7ns6v" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.576337 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.578051 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0682ffd5-84fb-4e36-8386-f65aa88b6184-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-7ns6v\" (UID: \"0682ffd5-84fb-4e36-8386-f65aa88b6184\") " pod="openstack/dnsmasq-dns-85ff748b95-7ns6v" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.596567 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-75b7946cc8-hzcp7"] Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.609863 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z94j5\" (UniqueName: \"kubernetes.io/projected/0682ffd5-84fb-4e36-8386-f65aa88b6184-kube-api-access-z94j5\") pod \"dnsmasq-dns-85ff748b95-7ns6v\" (UID: \"0682ffd5-84fb-4e36-8386-f65aa88b6184\") " pod="openstack/dnsmasq-dns-85ff748b95-7ns6v" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.613147 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-57c5555847-t5zf4" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.634482 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5d4f95c8c8-c5lws" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.670110 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4093fb52-8433-4e30-9a08-9fc77fb5d49e-config-data\") pod \"barbican-api-75b7946cc8-hzcp7\" (UID: \"4093fb52-8433-4e30-9a08-9fc77fb5d49e\") " pod="openstack/barbican-api-75b7946cc8-hzcp7" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.670357 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfsth\" (UniqueName: \"kubernetes.io/projected/4093fb52-8433-4e30-9a08-9fc77fb5d49e-kube-api-access-kfsth\") pod \"barbican-api-75b7946cc8-hzcp7\" (UID: \"4093fb52-8433-4e30-9a08-9fc77fb5d49e\") " pod="openstack/barbican-api-75b7946cc8-hzcp7" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.670427 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4093fb52-8433-4e30-9a08-9fc77fb5d49e-logs\") pod \"barbican-api-75b7946cc8-hzcp7\" (UID: \"4093fb52-8433-4e30-9a08-9fc77fb5d49e\") " pod="openstack/barbican-api-75b7946cc8-hzcp7" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.670483 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4093fb52-8433-4e30-9a08-9fc77fb5d49e-combined-ca-bundle\") pod \"barbican-api-75b7946cc8-hzcp7\" (UID: \"4093fb52-8433-4e30-9a08-9fc77fb5d49e\") " pod="openstack/barbican-api-75b7946cc8-hzcp7" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.670510 4734 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4093fb52-8433-4e30-9a08-9fc77fb5d49e-config-data-custom\") pod \"barbican-api-75b7946cc8-hzcp7\" (UID: \"4093fb52-8433-4e30-9a08-9fc77fb5d49e\") " pod="openstack/barbican-api-75b7946cc8-hzcp7" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.772905 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4093fb52-8433-4e30-9a08-9fc77fb5d49e-config-data\") pod \"barbican-api-75b7946cc8-hzcp7\" (UID: \"4093fb52-8433-4e30-9a08-9fc77fb5d49e\") " pod="openstack/barbican-api-75b7946cc8-hzcp7" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.773098 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfsth\" (UniqueName: \"kubernetes.io/projected/4093fb52-8433-4e30-9a08-9fc77fb5d49e-kube-api-access-kfsth\") pod \"barbican-api-75b7946cc8-hzcp7\" (UID: \"4093fb52-8433-4e30-9a08-9fc77fb5d49e\") " pod="openstack/barbican-api-75b7946cc8-hzcp7" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.773140 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4093fb52-8433-4e30-9a08-9fc77fb5d49e-logs\") pod \"barbican-api-75b7946cc8-hzcp7\" (UID: \"4093fb52-8433-4e30-9a08-9fc77fb5d49e\") " pod="openstack/barbican-api-75b7946cc8-hzcp7" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.773168 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4093fb52-8433-4e30-9a08-9fc77fb5d49e-combined-ca-bundle\") pod \"barbican-api-75b7946cc8-hzcp7\" (UID: \"4093fb52-8433-4e30-9a08-9fc77fb5d49e\") " pod="openstack/barbican-api-75b7946cc8-hzcp7" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.773193 4734 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4093fb52-8433-4e30-9a08-9fc77fb5d49e-config-data-custom\") pod \"barbican-api-75b7946cc8-hzcp7\" (UID: \"4093fb52-8433-4e30-9a08-9fc77fb5d49e\") " pod="openstack/barbican-api-75b7946cc8-hzcp7" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.775484 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4093fb52-8433-4e30-9a08-9fc77fb5d49e-logs\") pod \"barbican-api-75b7946cc8-hzcp7\" (UID: \"4093fb52-8433-4e30-9a08-9fc77fb5d49e\") " pod="openstack/barbican-api-75b7946cc8-hzcp7" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.779821 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4093fb52-8433-4e30-9a08-9fc77fb5d49e-combined-ca-bundle\") pod \"barbican-api-75b7946cc8-hzcp7\" (UID: \"4093fb52-8433-4e30-9a08-9fc77fb5d49e\") " pod="openstack/barbican-api-75b7946cc8-hzcp7" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.780650 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4093fb52-8433-4e30-9a08-9fc77fb5d49e-config-data\") pod \"barbican-api-75b7946cc8-hzcp7\" (UID: \"4093fb52-8433-4e30-9a08-9fc77fb5d49e\") " pod="openstack/barbican-api-75b7946cc8-hzcp7" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.782637 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4093fb52-8433-4e30-9a08-9fc77fb5d49e-config-data-custom\") pod \"barbican-api-75b7946cc8-hzcp7\" (UID: \"4093fb52-8433-4e30-9a08-9fc77fb5d49e\") " pod="openstack/barbican-api-75b7946cc8-hzcp7" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.791148 4734 generic.go:334] "Generic (PLEG): container finished" podID="4c26d17f-e341-41c5-9759-c0b265fcceea" 
containerID="1b18a6a4b18d08501789d9e21b925316c413cdbe7a4ed004b5fef4a11dfd69d1" exitCode=0 Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.791260 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xsvx9" event={"ID":"4c26d17f-e341-41c5-9759-c0b265fcceea","Type":"ContainerDied","Data":"1b18a6a4b18d08501789d9e21b925316c413cdbe7a4ed004b5fef4a11dfd69d1"} Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.814692 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfsth\" (UniqueName: \"kubernetes.io/projected/4093fb52-8433-4e30-9a08-9fc77fb5d49e-kube-api-access-kfsth\") pod \"barbican-api-75b7946cc8-hzcp7\" (UID: \"4093fb52-8433-4e30-9a08-9fc77fb5d49e\") " pod="openstack/barbican-api-75b7946cc8-hzcp7" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.830886 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-7ns6v" Dec 05 23:39:08 crc kubenswrapper[4734]: E1205 23:39:08.837020 4734 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21a9837d_cc1d_4bf1_9b52_f9196880e367.slice/crio-e3a126dc953224f7cf69d43ba28034a6b6151fba211d245d8ef374996a56a5fa.scope\": RecentStats: unable to find data in memory cache]" Dec 05 23:39:08 crc kubenswrapper[4734]: I1205 23:39:08.922144 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-75b7946cc8-hzcp7" Dec 05 23:39:09 crc kubenswrapper[4734]: I1205 23:39:09.153456 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 23:39:09 crc kubenswrapper[4734]: W1205 23:39:09.158672 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe03fe87_03d0_45aa_a054_5e991c765ccc.slice/crio-359b329f911140597764294baedc06b72d5d3a63e4ea959d255a3b1634106471 WatchSource:0}: Error finding container 359b329f911140597764294baedc06b72d5d3a63e4ea959d255a3b1634106471: Status 404 returned error can't find the container with id 359b329f911140597764294baedc06b72d5d3a63e4ea959d255a3b1634106471 Dec 05 23:39:09 crc kubenswrapper[4734]: W1205 23:39:09.275587 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb35b4bd8_efbd_4f96_9962_490ea41d44d1.slice/crio-a8fddb25f0e04316484dddbd2b8a1d3eea3f23a613ff86baeaf6d0ab34ab53aa WatchSource:0}: Error finding container a8fddb25f0e04316484dddbd2b8a1d3eea3f23a613ff86baeaf6d0ab34ab53aa: Status 404 returned error can't find the container with id a8fddb25f0e04316484dddbd2b8a1d3eea3f23a613ff86baeaf6d0ab34ab53aa Dec 05 23:39:09 crc kubenswrapper[4734]: I1205 23:39:09.275964 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-57c5555847-t5zf4"] Dec 05 23:39:09 crc kubenswrapper[4734]: I1205 23:39:09.400783 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5d4f95c8c8-c5lws"] Dec 05 23:39:09 crc kubenswrapper[4734]: I1205 23:39:09.418407 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-654798d8cb-lq2fv" Dec 05 23:39:09 crc kubenswrapper[4734]: I1205 23:39:09.493556 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-7ns6v"] Dec 05 
23:39:09 crc kubenswrapper[4734]: I1205 23:39:09.598071 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-75b7946cc8-hzcp7"] Dec 05 23:39:09 crc kubenswrapper[4734]: I1205 23:39:09.652472 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="067f11aa-41d5-4a34-9f2e-33b35981e9ba" path="/var/lib/kubelet/pods/067f11aa-41d5-4a34-9f2e-33b35981e9ba/volumes" Dec 05 23:39:09 crc kubenswrapper[4734]: I1205 23:39:09.653342 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f9ab2cc-aaf2-46c4-b03b-c70d220732cb" path="/var/lib/kubelet/pods/5f9ab2cc-aaf2-46c4-b03b-c70d220732cb/volumes" Dec 05 23:39:09 crc kubenswrapper[4734]: I1205 23:39:09.839342 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-7ns6v" event={"ID":"0682ffd5-84fb-4e36-8386-f65aa88b6184","Type":"ContainerStarted","Data":"cbe860e9f094392ab945231b1d25910fc611f2dc9c7bb0eeff8e53ccb31fcd08"} Dec 05 23:39:09 crc kubenswrapper[4734]: I1205 23:39:09.842196 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-57c5555847-t5zf4" event={"ID":"b35b4bd8-efbd-4f96-9962-490ea41d44d1","Type":"ContainerStarted","Data":"a8fddb25f0e04316484dddbd2b8a1d3eea3f23a613ff86baeaf6d0ab34ab53aa"} Dec 05 23:39:09 crc kubenswrapper[4734]: I1205 23:39:09.843280 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe03fe87-03d0-45aa-a054-5e991c765ccc","Type":"ContainerStarted","Data":"359b329f911140597764294baedc06b72d5d3a63e4ea959d255a3b1634106471"} Dec 05 23:39:09 crc kubenswrapper[4734]: I1205 23:39:09.851830 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d4f95c8c8-c5lws" event={"ID":"0266e747-392d-46c1-bc3e-0ef614db01e3","Type":"ContainerStarted","Data":"36e109d938ecf81ec67b0b2368d8faa9e3a28cbfeca719279bd6af2287fc1d12"} Dec 05 23:39:09 crc kubenswrapper[4734]: I1205 23:39:09.854105 4734 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75b7946cc8-hzcp7" event={"ID":"4093fb52-8433-4e30-9a08-9fc77fb5d49e","Type":"ContainerStarted","Data":"4fd3041ce1d84d198169382e76509c859b64138831cc3374cc9dd8d9be045393"} Dec 05 23:39:10 crc kubenswrapper[4734]: I1205 23:39:10.289243 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xsvx9" Dec 05 23:39:10 crc kubenswrapper[4734]: I1205 23:39:10.441899 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c26d17f-e341-41c5-9759-c0b265fcceea-scripts\") pod \"4c26d17f-e341-41c5-9759-c0b265fcceea\" (UID: \"4c26d17f-e341-41c5-9759-c0b265fcceea\") " Dec 05 23:39:10 crc kubenswrapper[4734]: I1205 23:39:10.442516 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c26d17f-e341-41c5-9759-c0b265fcceea-config-data\") pod \"4c26d17f-e341-41c5-9759-c0b265fcceea\" (UID: \"4c26d17f-e341-41c5-9759-c0b265fcceea\") " Dec 05 23:39:10 crc kubenswrapper[4734]: I1205 23:39:10.442612 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pf9fs\" (UniqueName: \"kubernetes.io/projected/4c26d17f-e341-41c5-9759-c0b265fcceea-kube-api-access-pf9fs\") pod \"4c26d17f-e341-41c5-9759-c0b265fcceea\" (UID: \"4c26d17f-e341-41c5-9759-c0b265fcceea\") " Dec 05 23:39:10 crc kubenswrapper[4734]: I1205 23:39:10.442710 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4c26d17f-e341-41c5-9759-c0b265fcceea-etc-machine-id\") pod \"4c26d17f-e341-41c5-9759-c0b265fcceea\" (UID: \"4c26d17f-e341-41c5-9759-c0b265fcceea\") " Dec 05 23:39:10 crc kubenswrapper[4734]: I1205 23:39:10.442816 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c26d17f-e341-41c5-9759-c0b265fcceea-combined-ca-bundle\") pod \"4c26d17f-e341-41c5-9759-c0b265fcceea\" (UID: \"4c26d17f-e341-41c5-9759-c0b265fcceea\") " Dec 05 23:39:10 crc kubenswrapper[4734]: I1205 23:39:10.442917 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4c26d17f-e341-41c5-9759-c0b265fcceea-db-sync-config-data\") pod \"4c26d17f-e341-41c5-9759-c0b265fcceea\" (UID: \"4c26d17f-e341-41c5-9759-c0b265fcceea\") " Dec 05 23:39:10 crc kubenswrapper[4734]: I1205 23:39:10.443033 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c26d17f-e341-41c5-9759-c0b265fcceea-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4c26d17f-e341-41c5-9759-c0b265fcceea" (UID: "4c26d17f-e341-41c5-9759-c0b265fcceea"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:39:10 crc kubenswrapper[4734]: I1205 23:39:10.443787 4734 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4c26d17f-e341-41c5-9759-c0b265fcceea-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:10 crc kubenswrapper[4734]: I1205 23:39:10.450000 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c26d17f-e341-41c5-9759-c0b265fcceea-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4c26d17f-e341-41c5-9759-c0b265fcceea" (UID: "4c26d17f-e341-41c5-9759-c0b265fcceea"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:10 crc kubenswrapper[4734]: I1205 23:39:10.451300 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c26d17f-e341-41c5-9759-c0b265fcceea-scripts" (OuterVolumeSpecName: "scripts") pod "4c26d17f-e341-41c5-9759-c0b265fcceea" (UID: "4c26d17f-e341-41c5-9759-c0b265fcceea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:10 crc kubenswrapper[4734]: I1205 23:39:10.463624 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c26d17f-e341-41c5-9759-c0b265fcceea-kube-api-access-pf9fs" (OuterVolumeSpecName: "kube-api-access-pf9fs") pod "4c26d17f-e341-41c5-9759-c0b265fcceea" (UID: "4c26d17f-e341-41c5-9759-c0b265fcceea"). InnerVolumeSpecName "kube-api-access-pf9fs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:39:10 crc kubenswrapper[4734]: I1205 23:39:10.485280 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c26d17f-e341-41c5-9759-c0b265fcceea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c26d17f-e341-41c5-9759-c0b265fcceea" (UID: "4c26d17f-e341-41c5-9759-c0b265fcceea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:10 crc kubenswrapper[4734]: I1205 23:39:10.505194 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c26d17f-e341-41c5-9759-c0b265fcceea-config-data" (OuterVolumeSpecName: "config-data") pod "4c26d17f-e341-41c5-9759-c0b265fcceea" (UID: "4c26d17f-e341-41c5-9759-c0b265fcceea"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:10 crc kubenswrapper[4734]: I1205 23:39:10.545618 4734 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4c26d17f-e341-41c5-9759-c0b265fcceea-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:10 crc kubenswrapper[4734]: I1205 23:39:10.545646 4734 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c26d17f-e341-41c5-9759-c0b265fcceea-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:10 crc kubenswrapper[4734]: I1205 23:39:10.545658 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c26d17f-e341-41c5-9759-c0b265fcceea-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:10 crc kubenswrapper[4734]: I1205 23:39:10.545670 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pf9fs\" (UniqueName: \"kubernetes.io/projected/4c26d17f-e341-41c5-9759-c0b265fcceea-kube-api-access-pf9fs\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:10 crc kubenswrapper[4734]: I1205 23:39:10.545685 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c26d17f-e341-41c5-9759-c0b265fcceea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:10 crc kubenswrapper[4734]: I1205 23:39:10.867480 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xsvx9" event={"ID":"4c26d17f-e341-41c5-9759-c0b265fcceea","Type":"ContainerDied","Data":"48d3586a98696ec0f0c4d981bd7f858c4ce00e5979652fad7b1e49dcc55d061e"} Dec 05 23:39:10 crc kubenswrapper[4734]: I1205 23:39:10.869220 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48d3586a98696ec0f0c4d981bd7f858c4ce00e5979652fad7b1e49dcc55d061e" Dec 05 23:39:10 crc kubenswrapper[4734]: I1205 23:39:10.867488 4734 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xsvx9" Dec 05 23:39:10 crc kubenswrapper[4734]: I1205 23:39:10.869667 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-7ns6v" event={"ID":"0682ffd5-84fb-4e36-8386-f65aa88b6184","Type":"ContainerStarted","Data":"3c33eba90d0c6e7901e28a8f25e9d053ac547147f011915a857bb351822ad071"} Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.189090 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 23:39:11 crc kubenswrapper[4734]: E1205 23:39:11.190167 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c26d17f-e341-41c5-9759-c0b265fcceea" containerName="cinder-db-sync" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.190196 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c26d17f-e341-41c5-9759-c0b265fcceea" containerName="cinder-db-sync" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.190455 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c26d17f-e341-41c5-9759-c0b265fcceea" containerName="cinder-db-sync" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.191860 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.200371 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-rgfg5" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.201461 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.201612 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.201732 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.208022 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.258289 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-7ns6v"] Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.283893 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f773bf-b6a2-4f90-bf47-f3bd63431381-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e2f773bf-b6a2-4f90-bf47-f3bd63431381\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.284009 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2f773bf-b6a2-4f90-bf47-f3bd63431381-scripts\") pod \"cinder-scheduler-0\" (UID: \"e2f773bf-b6a2-4f90-bf47-f3bd63431381\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.284109 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96dl9\" 
(UniqueName: \"kubernetes.io/projected/e2f773bf-b6a2-4f90-bf47-f3bd63431381-kube-api-access-96dl9\") pod \"cinder-scheduler-0\" (UID: \"e2f773bf-b6a2-4f90-bf47-f3bd63431381\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.284184 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f773bf-b6a2-4f90-bf47-f3bd63431381-config-data\") pod \"cinder-scheduler-0\" (UID: \"e2f773bf-b6a2-4f90-bf47-f3bd63431381\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.284234 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2f773bf-b6a2-4f90-bf47-f3bd63431381-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e2f773bf-b6a2-4f90-bf47-f3bd63431381\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.284275 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e2f773bf-b6a2-4f90-bf47-f3bd63431381-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e2f773bf-b6a2-4f90-bf47-f3bd63431381\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.325649 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-d6vvd"] Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.328410 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-d6vvd" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.368825 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-d6vvd"] Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.390023 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e2f773bf-b6a2-4f90-bf47-f3bd63431381-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e2f773bf-b6a2-4f90-bf47-f3bd63431381\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.390133 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f773bf-b6a2-4f90-bf47-f3bd63431381-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e2f773bf-b6a2-4f90-bf47-f3bd63431381\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.390184 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2f773bf-b6a2-4f90-bf47-f3bd63431381-scripts\") pod \"cinder-scheduler-0\" (UID: \"e2f773bf-b6a2-4f90-bf47-f3bd63431381\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.390238 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96dl9\" (UniqueName: \"kubernetes.io/projected/e2f773bf-b6a2-4f90-bf47-f3bd63431381-kube-api-access-96dl9\") pod \"cinder-scheduler-0\" (UID: \"e2f773bf-b6a2-4f90-bf47-f3bd63431381\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.390276 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f773bf-b6a2-4f90-bf47-f3bd63431381-config-data\") pod \"cinder-scheduler-0\" (UID: 
\"e2f773bf-b6a2-4f90-bf47-f3bd63431381\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.390299 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2f773bf-b6a2-4f90-bf47-f3bd63431381-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e2f773bf-b6a2-4f90-bf47-f3bd63431381\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.393076 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e2f773bf-b6a2-4f90-bf47-f3bd63431381-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e2f773bf-b6a2-4f90-bf47-f3bd63431381\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.397211 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f773bf-b6a2-4f90-bf47-f3bd63431381-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e2f773bf-b6a2-4f90-bf47-f3bd63431381\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.399707 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2f773bf-b6a2-4f90-bf47-f3bd63431381-scripts\") pod \"cinder-scheduler-0\" (UID: \"e2f773bf-b6a2-4f90-bf47-f3bd63431381\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.403828 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f773bf-b6a2-4f90-bf47-f3bd63431381-config-data\") pod \"cinder-scheduler-0\" (UID: \"e2f773bf-b6a2-4f90-bf47-f3bd63431381\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.423692 4734 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-96dl9\" (UniqueName: \"kubernetes.io/projected/e2f773bf-b6a2-4f90-bf47-f3bd63431381-kube-api-access-96dl9\") pod \"cinder-scheduler-0\" (UID: \"e2f773bf-b6a2-4f90-bf47-f3bd63431381\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.427875 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2f773bf-b6a2-4f90-bf47-f3bd63431381-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e2f773bf-b6a2-4f90-bf47-f3bd63431381\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.494728 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rkm6\" (UniqueName: \"kubernetes.io/projected/3850ca0d-4d1c-4b14-8633-f313cbb09401-kube-api-access-2rkm6\") pod \"dnsmasq-dns-5c9776ccc5-d6vvd\" (UID: \"3850ca0d-4d1c-4b14-8633-f313cbb09401\") " pod="openstack/dnsmasq-dns-5c9776ccc5-d6vvd" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.494847 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3850ca0d-4d1c-4b14-8633-f313cbb09401-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-d6vvd\" (UID: \"3850ca0d-4d1c-4b14-8633-f313cbb09401\") " pod="openstack/dnsmasq-dns-5c9776ccc5-d6vvd" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.494931 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3850ca0d-4d1c-4b14-8633-f313cbb09401-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-d6vvd\" (UID: \"3850ca0d-4d1c-4b14-8633-f313cbb09401\") " pod="openstack/dnsmasq-dns-5c9776ccc5-d6vvd" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.494964 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3850ca0d-4d1c-4b14-8633-f313cbb09401-config\") pod \"dnsmasq-dns-5c9776ccc5-d6vvd\" (UID: \"3850ca0d-4d1c-4b14-8633-f313cbb09401\") " pod="openstack/dnsmasq-dns-5c9776ccc5-d6vvd" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.495073 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3850ca0d-4d1c-4b14-8633-f313cbb09401-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-d6vvd\" (UID: \"3850ca0d-4d1c-4b14-8633-f313cbb09401\") " pod="openstack/dnsmasq-dns-5c9776ccc5-d6vvd" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.495210 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3850ca0d-4d1c-4b14-8633-f313cbb09401-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-d6vvd\" (UID: \"3850ca0d-4d1c-4b14-8633-f313cbb09401\") " pod="openstack/dnsmasq-dns-5c9776ccc5-d6vvd" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.546146 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.550853 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.555579 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.557632 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.577481 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.597430 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3850ca0d-4d1c-4b14-8633-f313cbb09401-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-d6vvd\" (UID: \"3850ca0d-4d1c-4b14-8633-f313cbb09401\") " pod="openstack/dnsmasq-dns-5c9776ccc5-d6vvd" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.597526 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rkm6\" (UniqueName: \"kubernetes.io/projected/3850ca0d-4d1c-4b14-8633-f313cbb09401-kube-api-access-2rkm6\") pod \"dnsmasq-dns-5c9776ccc5-d6vvd\" (UID: \"3850ca0d-4d1c-4b14-8633-f313cbb09401\") " pod="openstack/dnsmasq-dns-5c9776ccc5-d6vvd" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.597593 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3850ca0d-4d1c-4b14-8633-f313cbb09401-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-d6vvd\" (UID: \"3850ca0d-4d1c-4b14-8633-f313cbb09401\") " pod="openstack/dnsmasq-dns-5c9776ccc5-d6vvd" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.597651 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3850ca0d-4d1c-4b14-8633-f313cbb09401-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-d6vvd\" (UID: \"3850ca0d-4d1c-4b14-8633-f313cbb09401\") " pod="openstack/dnsmasq-dns-5c9776ccc5-d6vvd" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.597676 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3850ca0d-4d1c-4b14-8633-f313cbb09401-config\") pod \"dnsmasq-dns-5c9776ccc5-d6vvd\" (UID: \"3850ca0d-4d1c-4b14-8633-f313cbb09401\") " pod="openstack/dnsmasq-dns-5c9776ccc5-d6vvd" 
Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.597796 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3850ca0d-4d1c-4b14-8633-f313cbb09401-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-d6vvd\" (UID: \"3850ca0d-4d1c-4b14-8633-f313cbb09401\") " pod="openstack/dnsmasq-dns-5c9776ccc5-d6vvd" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.598683 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3850ca0d-4d1c-4b14-8633-f313cbb09401-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-d6vvd\" (UID: \"3850ca0d-4d1c-4b14-8633-f313cbb09401\") " pod="openstack/dnsmasq-dns-5c9776ccc5-d6vvd" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.598806 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3850ca0d-4d1c-4b14-8633-f313cbb09401-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-d6vvd\" (UID: \"3850ca0d-4d1c-4b14-8633-f313cbb09401\") " pod="openstack/dnsmasq-dns-5c9776ccc5-d6vvd" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.599419 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3850ca0d-4d1c-4b14-8633-f313cbb09401-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-d6vvd\" (UID: \"3850ca0d-4d1c-4b14-8633-f313cbb09401\") " pod="openstack/dnsmasq-dns-5c9776ccc5-d6vvd" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.599433 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3850ca0d-4d1c-4b14-8633-f313cbb09401-config\") pod \"dnsmasq-dns-5c9776ccc5-d6vvd\" (UID: \"3850ca0d-4d1c-4b14-8633-f313cbb09401\") " pod="openstack/dnsmasq-dns-5c9776ccc5-d6vvd" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.600286 4734 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3850ca0d-4d1c-4b14-8633-f313cbb09401-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-d6vvd\" (UID: \"3850ca0d-4d1c-4b14-8633-f313cbb09401\") " pod="openstack/dnsmasq-dns-5c9776ccc5-d6vvd" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.622154 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rkm6\" (UniqueName: \"kubernetes.io/projected/3850ca0d-4d1c-4b14-8633-f313cbb09401-kube-api-access-2rkm6\") pod \"dnsmasq-dns-5c9776ccc5-d6vvd\" (UID: \"3850ca0d-4d1c-4b14-8633-f313cbb09401\") " pod="openstack/dnsmasq-dns-5c9776ccc5-d6vvd" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.680214 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-d6vvd" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.701856 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e592729-e1ff-4707-8fd7-379eff2c5790-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6e592729-e1ff-4707-8fd7-379eff2c5790\") " pod="openstack/cinder-api-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.703110 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e592729-e1ff-4707-8fd7-379eff2c5790-logs\") pod \"cinder-api-0\" (UID: \"6e592729-e1ff-4707-8fd7-379eff2c5790\") " pod="openstack/cinder-api-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.703285 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e592729-e1ff-4707-8fd7-379eff2c5790-config-data-custom\") pod \"cinder-api-0\" (UID: \"6e592729-e1ff-4707-8fd7-379eff2c5790\") " pod="openstack/cinder-api-0" Dec 
05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.703403 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e592729-e1ff-4707-8fd7-379eff2c5790-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6e592729-e1ff-4707-8fd7-379eff2c5790\") " pod="openstack/cinder-api-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.703528 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2fcr\" (UniqueName: \"kubernetes.io/projected/6e592729-e1ff-4707-8fd7-379eff2c5790-kube-api-access-m2fcr\") pod \"cinder-api-0\" (UID: \"6e592729-e1ff-4707-8fd7-379eff2c5790\") " pod="openstack/cinder-api-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.703733 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e592729-e1ff-4707-8fd7-379eff2c5790-scripts\") pod \"cinder-api-0\" (UID: \"6e592729-e1ff-4707-8fd7-379eff2c5790\") " pod="openstack/cinder-api-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.703846 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e592729-e1ff-4707-8fd7-379eff2c5790-config-data\") pod \"cinder-api-0\" (UID: \"6e592729-e1ff-4707-8fd7-379eff2c5790\") " pod="openstack/cinder-api-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.807973 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e592729-e1ff-4707-8fd7-379eff2c5790-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6e592729-e1ff-4707-8fd7-379eff2c5790\") " pod="openstack/cinder-api-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.808134 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e592729-e1ff-4707-8fd7-379eff2c5790-logs\") pod \"cinder-api-0\" (UID: \"6e592729-e1ff-4707-8fd7-379eff2c5790\") " pod="openstack/cinder-api-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.808215 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e592729-e1ff-4707-8fd7-379eff2c5790-config-data-custom\") pod \"cinder-api-0\" (UID: \"6e592729-e1ff-4707-8fd7-379eff2c5790\") " pod="openstack/cinder-api-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.808271 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e592729-e1ff-4707-8fd7-379eff2c5790-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6e592729-e1ff-4707-8fd7-379eff2c5790\") " pod="openstack/cinder-api-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.808324 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2fcr\" (UniqueName: \"kubernetes.io/projected/6e592729-e1ff-4707-8fd7-379eff2c5790-kube-api-access-m2fcr\") pod \"cinder-api-0\" (UID: \"6e592729-e1ff-4707-8fd7-379eff2c5790\") " pod="openstack/cinder-api-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.808374 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e592729-e1ff-4707-8fd7-379eff2c5790-scripts\") pod \"cinder-api-0\" (UID: \"6e592729-e1ff-4707-8fd7-379eff2c5790\") " pod="openstack/cinder-api-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.808419 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e592729-e1ff-4707-8fd7-379eff2c5790-config-data\") pod \"cinder-api-0\" (UID: \"6e592729-e1ff-4707-8fd7-379eff2c5790\") " pod="openstack/cinder-api-0" Dec 05 23:39:11 crc 
kubenswrapper[4734]: I1205 23:39:11.808919 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e592729-e1ff-4707-8fd7-379eff2c5790-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6e592729-e1ff-4707-8fd7-379eff2c5790\") " pod="openstack/cinder-api-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.809383 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e592729-e1ff-4707-8fd7-379eff2c5790-logs\") pod \"cinder-api-0\" (UID: \"6e592729-e1ff-4707-8fd7-379eff2c5790\") " pod="openstack/cinder-api-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.849279 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2fcr\" (UniqueName: \"kubernetes.io/projected/6e592729-e1ff-4707-8fd7-379eff2c5790-kube-api-access-m2fcr\") pod \"cinder-api-0\" (UID: \"6e592729-e1ff-4707-8fd7-379eff2c5790\") " pod="openstack/cinder-api-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.869083 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e592729-e1ff-4707-8fd7-379eff2c5790-config-data-custom\") pod \"cinder-api-0\" (UID: \"6e592729-e1ff-4707-8fd7-379eff2c5790\") " pod="openstack/cinder-api-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.872259 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e592729-e1ff-4707-8fd7-379eff2c5790-config-data\") pod \"cinder-api-0\" (UID: \"6e592729-e1ff-4707-8fd7-379eff2c5790\") " pod="openstack/cinder-api-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.879102 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e592729-e1ff-4707-8fd7-379eff2c5790-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"6e592729-e1ff-4707-8fd7-379eff2c5790\") " pod="openstack/cinder-api-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.906263 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e592729-e1ff-4707-8fd7-379eff2c5790-scripts\") pod \"cinder-api-0\" (UID: \"6e592729-e1ff-4707-8fd7-379eff2c5790\") " pod="openstack/cinder-api-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.912639 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.955170 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75b7946cc8-hzcp7" event={"ID":"4093fb52-8433-4e30-9a08-9fc77fb5d49e","Type":"ContainerStarted","Data":"130367aed39c766db24d0825b622516d6a6256618ee3738e9e1354f424f5a5a4"} Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.955241 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75b7946cc8-hzcp7" event={"ID":"4093fb52-8433-4e30-9a08-9fc77fb5d49e","Type":"ContainerStarted","Data":"bd514e4b2369b6ed51bbdc649f68e8ac1b180b1bf9b9530da65841f4c089923a"} Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.955442 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-75b7946cc8-hzcp7" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.955517 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-75b7946cc8-hzcp7" Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.976110 4734 generic.go:334] "Generic (PLEG): container finished" podID="0682ffd5-84fb-4e36-8386-f65aa88b6184" containerID="3c33eba90d0c6e7901e28a8f25e9d053ac547147f011915a857bb351822ad071" exitCode=0 Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.976217 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-7ns6v" 
event={"ID":"0682ffd5-84fb-4e36-8386-f65aa88b6184","Type":"ContainerDied","Data":"3c33eba90d0c6e7901e28a8f25e9d053ac547147f011915a857bb351822ad071"} Dec 05 23:39:11 crc kubenswrapper[4734]: I1205 23:39:11.987374 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-75b7946cc8-hzcp7" podStartSLOduration=3.9873382939999997 podStartE2EDuration="3.987338294s" podCreationTimestamp="2025-12-05 23:39:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:39:11.980025046 +0000 UTC m=+1172.663429322" watchObservedRunningTime="2025-12-05 23:39:11.987338294 +0000 UTC m=+1172.670742570" Dec 05 23:39:12 crc kubenswrapper[4734]: I1205 23:39:12.004868 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe03fe87-03d0-45aa-a054-5e991c765ccc","Type":"ContainerStarted","Data":"eb6ec676712aee3b9d54d8188fc659aaed58c25d07c5471140bac8f9492272a9"} Dec 05 23:39:12 crc kubenswrapper[4734]: I1205 23:39:12.293881 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-67d74d57d5-4s4p7" Dec 05 23:39:12 crc kubenswrapper[4734]: I1205 23:39:12.369372 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-d6vvd"] Dec 05 23:39:12 crc kubenswrapper[4734]: I1205 23:39:12.434585 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-654798d8cb-lq2fv"] Dec 05 23:39:12 crc kubenswrapper[4734]: I1205 23:39:12.435088 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-654798d8cb-lq2fv" podUID="6a6a790b-5626-41aa-994f-0c0740790a7d" containerName="neutron-httpd" containerID="cri-o://efb38390a599c517100ebac471fbf6ae2aa043331ad9aa34737375b0e2c3b959" gracePeriod=30 Dec 05 23:39:12 crc kubenswrapper[4734]: I1205 23:39:12.435781 4734 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/neutron-654798d8cb-lq2fv" podUID="6a6a790b-5626-41aa-994f-0c0740790a7d" containerName="neutron-api" containerID="cri-o://2d13d89d313484f007afec1ce6d2ebd5001f4bd6907f6366fc11c7dec8e9b6b1" gracePeriod=30 Dec 05 23:39:12 crc kubenswrapper[4734]: I1205 23:39:12.457499 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 23:39:12 crc kubenswrapper[4734]: I1205 23:39:12.737844 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 23:39:13 crc kubenswrapper[4734]: I1205 23:39:13.023493 4734 generic.go:334] "Generic (PLEG): container finished" podID="6a6a790b-5626-41aa-994f-0c0740790a7d" containerID="efb38390a599c517100ebac471fbf6ae2aa043331ad9aa34737375b0e2c3b959" exitCode=0 Dec 05 23:39:13 crc kubenswrapper[4734]: I1205 23:39:13.023580 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-654798d8cb-lq2fv" event={"ID":"6a6a790b-5626-41aa-994f-0c0740790a7d","Type":"ContainerDied","Data":"efb38390a599c517100ebac471fbf6ae2aa043331ad9aa34737375b0e2c3b959"} Dec 05 23:39:13 crc kubenswrapper[4734]: I1205 23:39:13.026783 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe03fe87-03d0-45aa-a054-5e991c765ccc","Type":"ContainerStarted","Data":"bb470f46520236c5eab4dd981855f588aac943423952865089e1166f6e31a918"} Dec 05 23:39:13 crc kubenswrapper[4734]: E1205 23:39:13.668246 4734 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Dec 05 23:39:13 crc kubenswrapper[4734]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/0682ffd5-84fb-4e36-8386-f65aa88b6184/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 05 23:39:13 crc kubenswrapper[4734]: > podSandboxID="cbe860e9f094392ab945231b1d25910fc611f2dc9c7bb0eeff8e53ccb31fcd08" Dec 05 23:39:13 crc kubenswrapper[4734]: E1205 
23:39:13.668846 4734 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 05 23:39:13 crc kubenswrapper[4734]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n7ch57ch5c5hcch589hf7h577h659h96h5c8h5b4h55fhbbh667h565h5bchcbh58dh7dh5bch586h56ch574h598h67dh5c8h56dh8bh574h564hbch7q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z94j5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountP
ropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-85ff748b95-7ns6v_openstack(0682ffd5-84fb-4e36-8386-f65aa88b6184): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/0682ffd5-84fb-4e36-8386-f65aa88b6184/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 05 23:39:13 crc kubenswrapper[4734]: > logger="UnhandledError" Dec 05 23:39:13 crc kubenswrapper[4734]: E1205 23:39:13.670354 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/0682ffd5-84fb-4e36-8386-f65aa88b6184/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" 
pod="openstack/dnsmasq-dns-85ff748b95-7ns6v" podUID="0682ffd5-84fb-4e36-8386-f65aa88b6184" Dec 05 23:39:14 crc kubenswrapper[4734]: I1205 23:39:14.046698 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e2f773bf-b6a2-4f90-bf47-f3bd63431381","Type":"ContainerStarted","Data":"f1b38848a33ae411fa49cda4e8d6f2b947c7bc712936ac9ab8ae6e8ea5e02cce"} Dec 05 23:39:14 crc kubenswrapper[4734]: I1205 23:39:14.049380 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6e592729-e1ff-4707-8fd7-379eff2c5790","Type":"ContainerStarted","Data":"bb1c2e3ca8f944e69f5f17b4ae76a107c07db0ce047e77284004e3153141ce5c"} Dec 05 23:39:14 crc kubenswrapper[4734]: I1205 23:39:14.053511 4734 generic.go:334] "Generic (PLEG): container finished" podID="3850ca0d-4d1c-4b14-8633-f313cbb09401" containerID="e63d7900d1c92f0bb1e3bd75a47703ac3221b5646756f298631cf385233503ec" exitCode=0 Dec 05 23:39:14 crc kubenswrapper[4734]: I1205 23:39:14.055274 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-d6vvd" event={"ID":"3850ca0d-4d1c-4b14-8633-f313cbb09401","Type":"ContainerDied","Data":"e63d7900d1c92f0bb1e3bd75a47703ac3221b5646756f298631cf385233503ec"} Dec 05 23:39:14 crc kubenswrapper[4734]: I1205 23:39:14.055388 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-d6vvd" event={"ID":"3850ca0d-4d1c-4b14-8633-f313cbb09401","Type":"ContainerStarted","Data":"5da559fb92548f22f6f171f488931e398bddde9a520ff67ee5563b0827a61b98"} Dec 05 23:39:14 crc kubenswrapper[4734]: I1205 23:39:14.504995 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-755fc898d8-dlnbz" Dec 05 23:39:14 crc kubenswrapper[4734]: I1205 23:39:14.670107 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5d469948dd-n7t4x" Dec 05 23:39:14 crc kubenswrapper[4734]: I1205 23:39:14.738468 
4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 05 23:39:14 crc kubenswrapper[4734]: I1205 23:39:14.836705 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-7ns6v" Dec 05 23:39:14 crc kubenswrapper[4734]: I1205 23:39:14.949300 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0682ffd5-84fb-4e36-8386-f65aa88b6184-ovsdbserver-nb\") pod \"0682ffd5-84fb-4e36-8386-f65aa88b6184\" (UID: \"0682ffd5-84fb-4e36-8386-f65aa88b6184\") " Dec 05 23:39:14 crc kubenswrapper[4734]: I1205 23:39:14.949395 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0682ffd5-84fb-4e36-8386-f65aa88b6184-ovsdbserver-sb\") pod \"0682ffd5-84fb-4e36-8386-f65aa88b6184\" (UID: \"0682ffd5-84fb-4e36-8386-f65aa88b6184\") " Dec 05 23:39:14 crc kubenswrapper[4734]: I1205 23:39:14.949528 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z94j5\" (UniqueName: \"kubernetes.io/projected/0682ffd5-84fb-4e36-8386-f65aa88b6184-kube-api-access-z94j5\") pod \"0682ffd5-84fb-4e36-8386-f65aa88b6184\" (UID: \"0682ffd5-84fb-4e36-8386-f65aa88b6184\") " Dec 05 23:39:14 crc kubenswrapper[4734]: I1205 23:39:14.949585 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0682ffd5-84fb-4e36-8386-f65aa88b6184-config\") pod \"0682ffd5-84fb-4e36-8386-f65aa88b6184\" (UID: \"0682ffd5-84fb-4e36-8386-f65aa88b6184\") " Dec 05 23:39:14 crc kubenswrapper[4734]: I1205 23:39:14.949650 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0682ffd5-84fb-4e36-8386-f65aa88b6184-dns-swift-storage-0\") pod 
\"0682ffd5-84fb-4e36-8386-f65aa88b6184\" (UID: \"0682ffd5-84fb-4e36-8386-f65aa88b6184\") " Dec 05 23:39:14 crc kubenswrapper[4734]: I1205 23:39:14.949735 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0682ffd5-84fb-4e36-8386-f65aa88b6184-dns-svc\") pod \"0682ffd5-84fb-4e36-8386-f65aa88b6184\" (UID: \"0682ffd5-84fb-4e36-8386-f65aa88b6184\") " Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.032827 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0682ffd5-84fb-4e36-8386-f65aa88b6184-kube-api-access-z94j5" (OuterVolumeSpecName: "kube-api-access-z94j5") pod "0682ffd5-84fb-4e36-8386-f65aa88b6184" (UID: "0682ffd5-84fb-4e36-8386-f65aa88b6184"). InnerVolumeSpecName "kube-api-access-z94j5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.061663 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z94j5\" (UniqueName: \"kubernetes.io/projected/0682ffd5-84fb-4e36-8386-f65aa88b6184-kube-api-access-z94j5\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.075935 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0682ffd5-84fb-4e36-8386-f65aa88b6184-config" (OuterVolumeSpecName: "config") pod "0682ffd5-84fb-4e36-8386-f65aa88b6184" (UID: "0682ffd5-84fb-4e36-8386-f65aa88b6184"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.082083 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0682ffd5-84fb-4e36-8386-f65aa88b6184-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0682ffd5-84fb-4e36-8386-f65aa88b6184" (UID: "0682ffd5-84fb-4e36-8386-f65aa88b6184"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.097697 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe03fe87-03d0-45aa-a054-5e991c765ccc","Type":"ContainerStarted","Data":"b531e588ce7410be24b22419fc98c0ca55459c394dd280ee194826305e09d19c"} Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.103287 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d4f95c8c8-c5lws" event={"ID":"0266e747-392d-46c1-bc3e-0ef614db01e3","Type":"ContainerStarted","Data":"90c9cfaf163a97728ab29983aaf0792193c1a1223ed08d32d52c321d79193dae"} Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.112111 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-7ns6v" event={"ID":"0682ffd5-84fb-4e36-8386-f65aa88b6184","Type":"ContainerDied","Data":"cbe860e9f094392ab945231b1d25910fc611f2dc9c7bb0eeff8e53ccb31fcd08"} Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.112199 4734 scope.go:117] "RemoveContainer" containerID="3c33eba90d0c6e7901e28a8f25e9d053ac547147f011915a857bb351822ad071" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.112398 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-7ns6v" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.122005 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-57c5555847-t5zf4" event={"ID":"b35b4bd8-efbd-4f96-9962-490ea41d44d1","Type":"ContainerStarted","Data":"7ba8f36edc6d2f1cececec23431493ebddd9cd8fed58deeae2c4f7fa2d51e859"} Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.148751 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-d6vvd" event={"ID":"3850ca0d-4d1c-4b14-8633-f313cbb09401","Type":"ContainerStarted","Data":"ec05b4c35d592b979f2fc36dd2e8ef72fd5ba0b5bf2ea588bf3bb7c682b79057"} Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.150779 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-d6vvd" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.166683 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0682ffd5-84fb-4e36-8386-f65aa88b6184-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.166727 4734 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0682ffd5-84fb-4e36-8386-f65aa88b6184-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.179023 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-d6vvd" podStartSLOduration=4.178999846 podStartE2EDuration="4.178999846s" podCreationTimestamp="2025-12-05 23:39:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:39:15.174464246 +0000 UTC m=+1175.857868522" watchObservedRunningTime="2025-12-05 23:39:15.178999846 +0000 UTC m=+1175.862404122" Dec 05 23:39:15 crc kubenswrapper[4734]: 
I1205 23:39:15.300207 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0682ffd5-84fb-4e36-8386-f65aa88b6184-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0682ffd5-84fb-4e36-8386-f65aa88b6184" (UID: "0682ffd5-84fb-4e36-8386-f65aa88b6184"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.328260 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0682ffd5-84fb-4e36-8386-f65aa88b6184-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0682ffd5-84fb-4e36-8386-f65aa88b6184" (UID: "0682ffd5-84fb-4e36-8386-f65aa88b6184"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.339160 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0682ffd5-84fb-4e36-8386-f65aa88b6184-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0682ffd5-84fb-4e36-8386-f65aa88b6184" (UID: "0682ffd5-84fb-4e36-8386-f65aa88b6184"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.372906 4734 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0682ffd5-84fb-4e36-8386-f65aa88b6184-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.373091 4734 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0682ffd5-84fb-4e36-8386-f65aa88b6184-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.373175 4734 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0682ffd5-84fb-4e36-8386-f65aa88b6184-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.458185 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5574c9fdf8-q682b"] Dec 05 23:39:15 crc kubenswrapper[4734]: E1205 23:39:15.458757 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0682ffd5-84fb-4e36-8386-f65aa88b6184" containerName="init" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.458777 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="0682ffd5-84fb-4e36-8386-f65aa88b6184" containerName="init" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.458988 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="0682ffd5-84fb-4e36-8386-f65aa88b6184" containerName="init" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.460129 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5574c9fdf8-q682b" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.463558 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.463735 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.497402 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5574c9fdf8-q682b"] Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.579701 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv7jr\" (UniqueName: \"kubernetes.io/projected/d3c7aa3a-ca07-4476-8b39-06479afae42d-kube-api-access-wv7jr\") pod \"barbican-api-5574c9fdf8-q682b\" (UID: \"d3c7aa3a-ca07-4476-8b39-06479afae42d\") " pod="openstack/barbican-api-5574c9fdf8-q682b" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.579775 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3c7aa3a-ca07-4476-8b39-06479afae42d-config-data-custom\") pod \"barbican-api-5574c9fdf8-q682b\" (UID: \"d3c7aa3a-ca07-4476-8b39-06479afae42d\") " pod="openstack/barbican-api-5574c9fdf8-q682b" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.579797 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3c7aa3a-ca07-4476-8b39-06479afae42d-config-data\") pod \"barbican-api-5574c9fdf8-q682b\" (UID: \"d3c7aa3a-ca07-4476-8b39-06479afae42d\") " pod="openstack/barbican-api-5574c9fdf8-q682b" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.580259 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/d3c7aa3a-ca07-4476-8b39-06479afae42d-public-tls-certs\") pod \"barbican-api-5574c9fdf8-q682b\" (UID: \"d3c7aa3a-ca07-4476-8b39-06479afae42d\") " pod="openstack/barbican-api-5574c9fdf8-q682b" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.580468 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3c7aa3a-ca07-4476-8b39-06479afae42d-internal-tls-certs\") pod \"barbican-api-5574c9fdf8-q682b\" (UID: \"d3c7aa3a-ca07-4476-8b39-06479afae42d\") " pod="openstack/barbican-api-5574c9fdf8-q682b" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.580561 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3c7aa3a-ca07-4476-8b39-06479afae42d-combined-ca-bundle\") pod \"barbican-api-5574c9fdf8-q682b\" (UID: \"d3c7aa3a-ca07-4476-8b39-06479afae42d\") " pod="openstack/barbican-api-5574c9fdf8-q682b" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.580648 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3c7aa3a-ca07-4476-8b39-06479afae42d-logs\") pod \"barbican-api-5574c9fdf8-q682b\" (UID: \"d3c7aa3a-ca07-4476-8b39-06479afae42d\") " pod="openstack/barbican-api-5574c9fdf8-q682b" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.684087 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv7jr\" (UniqueName: \"kubernetes.io/projected/d3c7aa3a-ca07-4476-8b39-06479afae42d-kube-api-access-wv7jr\") pod \"barbican-api-5574c9fdf8-q682b\" (UID: \"d3c7aa3a-ca07-4476-8b39-06479afae42d\") " pod="openstack/barbican-api-5574c9fdf8-q682b" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.684159 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3c7aa3a-ca07-4476-8b39-06479afae42d-config-data-custom\") pod \"barbican-api-5574c9fdf8-q682b\" (UID: \"d3c7aa3a-ca07-4476-8b39-06479afae42d\") " pod="openstack/barbican-api-5574c9fdf8-q682b" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.684193 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3c7aa3a-ca07-4476-8b39-06479afae42d-config-data\") pod \"barbican-api-5574c9fdf8-q682b\" (UID: \"d3c7aa3a-ca07-4476-8b39-06479afae42d\") " pod="openstack/barbican-api-5574c9fdf8-q682b" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.684275 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3c7aa3a-ca07-4476-8b39-06479afae42d-public-tls-certs\") pod \"barbican-api-5574c9fdf8-q682b\" (UID: \"d3c7aa3a-ca07-4476-8b39-06479afae42d\") " pod="openstack/barbican-api-5574c9fdf8-q682b" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.684331 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3c7aa3a-ca07-4476-8b39-06479afae42d-internal-tls-certs\") pod \"barbican-api-5574c9fdf8-q682b\" (UID: \"d3c7aa3a-ca07-4476-8b39-06479afae42d\") " pod="openstack/barbican-api-5574c9fdf8-q682b" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.684356 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3c7aa3a-ca07-4476-8b39-06479afae42d-combined-ca-bundle\") pod \"barbican-api-5574c9fdf8-q682b\" (UID: \"d3c7aa3a-ca07-4476-8b39-06479afae42d\") " pod="openstack/barbican-api-5574c9fdf8-q682b" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.684389 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d3c7aa3a-ca07-4476-8b39-06479afae42d-logs\") pod \"barbican-api-5574c9fdf8-q682b\" (UID: \"d3c7aa3a-ca07-4476-8b39-06479afae42d\") " pod="openstack/barbican-api-5574c9fdf8-q682b" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.685146 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3c7aa3a-ca07-4476-8b39-06479afae42d-logs\") pod \"barbican-api-5574c9fdf8-q682b\" (UID: \"d3c7aa3a-ca07-4476-8b39-06479afae42d\") " pod="openstack/barbican-api-5574c9fdf8-q682b" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.692730 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3c7aa3a-ca07-4476-8b39-06479afae42d-config-data-custom\") pod \"barbican-api-5574c9fdf8-q682b\" (UID: \"d3c7aa3a-ca07-4476-8b39-06479afae42d\") " pod="openstack/barbican-api-5574c9fdf8-q682b" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.695065 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3c7aa3a-ca07-4476-8b39-06479afae42d-internal-tls-certs\") pod \"barbican-api-5574c9fdf8-q682b\" (UID: \"d3c7aa3a-ca07-4476-8b39-06479afae42d\") " pod="openstack/barbican-api-5574c9fdf8-q682b" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.696256 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3c7aa3a-ca07-4476-8b39-06479afae42d-config-data\") pod \"barbican-api-5574c9fdf8-q682b\" (UID: \"d3c7aa3a-ca07-4476-8b39-06479afae42d\") " pod="openstack/barbican-api-5574c9fdf8-q682b" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.705705 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-7ns6v"] Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.710166 4734 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3c7aa3a-ca07-4476-8b39-06479afae42d-combined-ca-bundle\") pod \"barbican-api-5574c9fdf8-q682b\" (UID: \"d3c7aa3a-ca07-4476-8b39-06479afae42d\") " pod="openstack/barbican-api-5574c9fdf8-q682b" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.710407 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3c7aa3a-ca07-4476-8b39-06479afae42d-public-tls-certs\") pod \"barbican-api-5574c9fdf8-q682b\" (UID: \"d3c7aa3a-ca07-4476-8b39-06479afae42d\") " pod="openstack/barbican-api-5574c9fdf8-q682b" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.719679 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv7jr\" (UniqueName: \"kubernetes.io/projected/d3c7aa3a-ca07-4476-8b39-06479afae42d-kube-api-access-wv7jr\") pod \"barbican-api-5574c9fdf8-q682b\" (UID: \"d3c7aa3a-ca07-4476-8b39-06479afae42d\") " pod="openstack/barbican-api-5574c9fdf8-q682b" Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.727751 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-7ns6v"] Dec 05 23:39:15 crc kubenswrapper[4734]: I1205 23:39:15.799199 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5574c9fdf8-q682b" Dec 05 23:39:16 crc kubenswrapper[4734]: I1205 23:39:16.195752 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d4f95c8c8-c5lws" event={"ID":"0266e747-392d-46c1-bc3e-0ef614db01e3","Type":"ContainerStarted","Data":"0b22d84d275ea09cc89905f0986d2fe09f1cfb24191d7e082bada599af75e829"} Dec 05 23:39:16 crc kubenswrapper[4734]: I1205 23:39:16.235732 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5d4f95c8c8-c5lws" podStartSLOduration=3.844070362 podStartE2EDuration="8.235707383s" podCreationTimestamp="2025-12-05 23:39:08 +0000 UTC" firstStartedPulling="2025-12-05 23:39:09.392491363 +0000 UTC m=+1170.075895639" lastFinishedPulling="2025-12-05 23:39:13.784128384 +0000 UTC m=+1174.467532660" observedRunningTime="2025-12-05 23:39:16.231116842 +0000 UTC m=+1176.914521108" watchObservedRunningTime="2025-12-05 23:39:16.235707383 +0000 UTC m=+1176.919111659" Dec 05 23:39:16 crc kubenswrapper[4734]: I1205 23:39:16.259637 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e2f773bf-b6a2-4f90-bf47-f3bd63431381","Type":"ContainerStarted","Data":"fe4d8d710b2e9360a934579a2cdd9ae758e44c010f3a43db871e05efcb6663a9"} Dec 05 23:39:16 crc kubenswrapper[4734]: I1205 23:39:16.296088 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6e592729-e1ff-4707-8fd7-379eff2c5790","Type":"ContainerStarted","Data":"e4e7c0296a369512765c2d12a0791812c6be741da4f9ebe4953a2b1d67331f32"} Dec 05 23:39:16 crc kubenswrapper[4734]: I1205 23:39:16.346924 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-57c5555847-t5zf4" event={"ID":"b35b4bd8-efbd-4f96-9962-490ea41d44d1","Type":"ContainerStarted","Data":"ee90dfa33da2c2e60f48c5f70431862e307f53073a93d1ca3f5d8ece568cb0c3"} Dec 05 23:39:16 crc 
kubenswrapper[4734]: I1205 23:39:16.348104 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5574c9fdf8-q682b"] Dec 05 23:39:16 crc kubenswrapper[4734]: I1205 23:39:16.395958 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-57c5555847-t5zf4" podStartSLOduration=3.891885001 podStartE2EDuration="8.395925975s" podCreationTimestamp="2025-12-05 23:39:08 +0000 UTC" firstStartedPulling="2025-12-05 23:39:09.28628234 +0000 UTC m=+1169.969686606" lastFinishedPulling="2025-12-05 23:39:13.790323314 +0000 UTC m=+1174.473727580" observedRunningTime="2025-12-05 23:39:16.377664512 +0000 UTC m=+1177.061068788" watchObservedRunningTime="2025-12-05 23:39:16.395925975 +0000 UTC m=+1177.079330251" Dec 05 23:39:17 crc kubenswrapper[4734]: I1205 23:39:17.418374 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5574c9fdf8-q682b" event={"ID":"d3c7aa3a-ca07-4476-8b39-06479afae42d","Type":"ContainerStarted","Data":"a8ab276d0b4c58d90e9cc8a4005f5540ed8a53c6fa54e6b7e7729961c8d1ba5c"} Dec 05 23:39:17 crc kubenswrapper[4734]: I1205 23:39:17.419208 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5574c9fdf8-q682b" event={"ID":"d3c7aa3a-ca07-4476-8b39-06479afae42d","Type":"ContainerStarted","Data":"aaf4a7d7b564a0bd438b09b007f92dfe7428d5cf124d5df5ddf3e776a507477a"} Dec 05 23:39:17 crc kubenswrapper[4734]: I1205 23:39:17.438058 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe03fe87-03d0-45aa-a054-5e991c765ccc","Type":"ContainerStarted","Data":"8eb3419c100e18d602d17e80460f5aa327255183e72309669de08e8b7228eb7b"} Dec 05 23:39:17 crc kubenswrapper[4734]: I1205 23:39:17.444670 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 23:39:17 crc kubenswrapper[4734]: I1205 23:39:17.474633 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"6e592729-e1ff-4707-8fd7-379eff2c5790","Type":"ContainerStarted","Data":"4efbe28e57748020b3de40562cccc01604426334d908232d354b84b890c4cb1f"} Dec 05 23:39:17 crc kubenswrapper[4734]: I1205 23:39:17.474965 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="6e592729-e1ff-4707-8fd7-379eff2c5790" containerName="cinder-api-log" containerID="cri-o://e4e7c0296a369512765c2d12a0791812c6be741da4f9ebe4953a2b1d67331f32" gracePeriod=30 Dec 05 23:39:17 crc kubenswrapper[4734]: I1205 23:39:17.475103 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 05 23:39:17 crc kubenswrapper[4734]: I1205 23:39:17.475149 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="6e592729-e1ff-4707-8fd7-379eff2c5790" containerName="cinder-api" containerID="cri-o://4efbe28e57748020b3de40562cccc01604426334d908232d354b84b890c4cb1f" gracePeriod=30 Dec 05 23:39:17 crc kubenswrapper[4734]: I1205 23:39:17.505086 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.498786272 podStartE2EDuration="10.505054942s" podCreationTimestamp="2025-12-05 23:39:07 +0000 UTC" firstStartedPulling="2025-12-05 23:39:09.161954177 +0000 UTC m=+1169.845358453" lastFinishedPulling="2025-12-05 23:39:16.168222857 +0000 UTC m=+1176.851627123" observedRunningTime="2025-12-05 23:39:17.495963272 +0000 UTC m=+1178.179367538" watchObservedRunningTime="2025-12-05 23:39:17.505054942 +0000 UTC m=+1178.188459218" Dec 05 23:39:17 crc kubenswrapper[4734]: I1205 23:39:17.554824 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.554782197 podStartE2EDuration="6.554782197s" podCreationTimestamp="2025-12-05 23:39:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:39:17.520044535 +0000 UTC m=+1178.203448811" watchObservedRunningTime="2025-12-05 23:39:17.554782197 +0000 UTC m=+1178.238186483" Dec 05 23:39:17 crc kubenswrapper[4734]: I1205 23:39:17.637042 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0682ffd5-84fb-4e36-8386-f65aa88b6184" path="/var/lib/kubelet/pods/0682ffd5-84fb-4e36-8386-f65aa88b6184/volumes" Dec 05 23:39:17 crc kubenswrapper[4734]: I1205 23:39:17.893577 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-755fc898d8-dlnbz" Dec 05 23:39:17 crc kubenswrapper[4734]: I1205 23:39:17.970267 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5d469948dd-n7t4x" Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.010092 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5d469948dd-n7t4x"] Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.510876 4734 generic.go:334] "Generic (PLEG): container finished" podID="6a6a790b-5626-41aa-994f-0c0740790a7d" containerID="2d13d89d313484f007afec1ce6d2ebd5001f4bd6907f6366fc11c7dec8e9b6b1" exitCode=0 Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.511393 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-654798d8cb-lq2fv" event={"ID":"6a6a790b-5626-41aa-994f-0c0740790a7d","Type":"ContainerDied","Data":"2d13d89d313484f007afec1ce6d2ebd5001f4bd6907f6366fc11c7dec8e9b6b1"} Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.534321 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e2f773bf-b6a2-4f90-bf47-f3bd63431381","Type":"ContainerStarted","Data":"113ed975ba333f7b50bd27ac66d62f813eea9cac077c729a734a72647d330de8"} Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.559076 4734 generic.go:334] "Generic (PLEG): container finished" 
podID="6e592729-e1ff-4707-8fd7-379eff2c5790" containerID="4efbe28e57748020b3de40562cccc01604426334d908232d354b84b890c4cb1f" exitCode=0 Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.559130 4734 generic.go:334] "Generic (PLEG): container finished" podID="6e592729-e1ff-4707-8fd7-379eff2c5790" containerID="e4e7c0296a369512765c2d12a0791812c6be741da4f9ebe4953a2b1d67331f32" exitCode=143 Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.559235 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6e592729-e1ff-4707-8fd7-379eff2c5790","Type":"ContainerDied","Data":"4efbe28e57748020b3de40562cccc01604426334d908232d354b84b890c4cb1f"} Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.559277 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6e592729-e1ff-4707-8fd7-379eff2c5790","Type":"ContainerDied","Data":"e4e7c0296a369512765c2d12a0791812c6be741da4f9ebe4953a2b1d67331f32"} Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.583568 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5d469948dd-n7t4x" podUID="c96cd173-4707-4edc-a92e-35db297082e2" containerName="horizon-log" containerID="cri-o://8415f4e36ba22f55e16f34c1af4d5e303eef0e3ab5c7ab16635ff6313372655d" gracePeriod=30 Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.586634 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5574c9fdf8-q682b" event={"ID":"d3c7aa3a-ca07-4476-8b39-06479afae42d","Type":"ContainerStarted","Data":"c751ffc91293459429e30c8169c8e89c1bb0ff6c5619b41c9e18c161fd2f9cf6"} Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.586858 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5d469948dd-n7t4x" podUID="c96cd173-4707-4edc-a92e-35db297082e2" containerName="horizon" containerID="cri-o://9514774e87490121e03a56ce448d088925b7977bc46c6a8cf0102e7f1954b7d8" 
gracePeriod=30 Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.586924 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5574c9fdf8-q682b" Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.586970 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5574c9fdf8-q682b" Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.625128 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.678105606 podStartE2EDuration="7.625094443s" podCreationTimestamp="2025-12-05 23:39:11 +0000 UTC" firstStartedPulling="2025-12-05 23:39:13.201824954 +0000 UTC m=+1173.885229230" lastFinishedPulling="2025-12-05 23:39:14.148813791 +0000 UTC m=+1174.832218067" observedRunningTime="2025-12-05 23:39:18.56800338 +0000 UTC m=+1179.251407656" watchObservedRunningTime="2025-12-05 23:39:18.625094443 +0000 UTC m=+1179.308498719" Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.640992 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5574c9fdf8-q682b" podStartSLOduration=3.640963798 podStartE2EDuration="3.640963798s" podCreationTimestamp="2025-12-05 23:39:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:39:18.62042931 +0000 UTC m=+1179.303833586" watchObservedRunningTime="2025-12-05 23:39:18.640963798 +0000 UTC m=+1179.324368074" Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.772408 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.783377 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-654798d8cb-lq2fv" Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.926593 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a6a790b-5626-41aa-994f-0c0740790a7d-combined-ca-bundle\") pod \"6a6a790b-5626-41aa-994f-0c0740790a7d\" (UID: \"6a6a790b-5626-41aa-994f-0c0740790a7d\") " Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.926644 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvhk8\" (UniqueName: \"kubernetes.io/projected/6a6a790b-5626-41aa-994f-0c0740790a7d-kube-api-access-kvhk8\") pod \"6a6a790b-5626-41aa-994f-0c0740790a7d\" (UID: \"6a6a790b-5626-41aa-994f-0c0740790a7d\") " Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.926764 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e592729-e1ff-4707-8fd7-379eff2c5790-config-data-custom\") pod \"6e592729-e1ff-4707-8fd7-379eff2c5790\" (UID: \"6e592729-e1ff-4707-8fd7-379eff2c5790\") " Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.926799 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a6a790b-5626-41aa-994f-0c0740790a7d-ovndb-tls-certs\") pod \"6a6a790b-5626-41aa-994f-0c0740790a7d\" (UID: \"6a6a790b-5626-41aa-994f-0c0740790a7d\") " Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.926836 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e592729-e1ff-4707-8fd7-379eff2c5790-scripts\") pod \"6e592729-e1ff-4707-8fd7-379eff2c5790\" (UID: \"6e592729-e1ff-4707-8fd7-379eff2c5790\") " Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.926865 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e592729-e1ff-4707-8fd7-379eff2c5790-combined-ca-bundle\") pod \"6e592729-e1ff-4707-8fd7-379eff2c5790\" (UID: \"6e592729-e1ff-4707-8fd7-379eff2c5790\") " Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.926980 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2fcr\" (UniqueName: \"kubernetes.io/projected/6e592729-e1ff-4707-8fd7-379eff2c5790-kube-api-access-m2fcr\") pod \"6e592729-e1ff-4707-8fd7-379eff2c5790\" (UID: \"6e592729-e1ff-4707-8fd7-379eff2c5790\") " Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.927029 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6a6a790b-5626-41aa-994f-0c0740790a7d-config\") pod \"6a6a790b-5626-41aa-994f-0c0740790a7d\" (UID: \"6a6a790b-5626-41aa-994f-0c0740790a7d\") " Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.927063 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e592729-e1ff-4707-8fd7-379eff2c5790-logs\") pod \"6e592729-e1ff-4707-8fd7-379eff2c5790\" (UID: \"6e592729-e1ff-4707-8fd7-379eff2c5790\") " Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.927108 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6a6a790b-5626-41aa-994f-0c0740790a7d-httpd-config\") pod \"6a6a790b-5626-41aa-994f-0c0740790a7d\" (UID: \"6a6a790b-5626-41aa-994f-0c0740790a7d\") " Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.927141 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e592729-e1ff-4707-8fd7-379eff2c5790-config-data\") pod \"6e592729-e1ff-4707-8fd7-379eff2c5790\" (UID: \"6e592729-e1ff-4707-8fd7-379eff2c5790\") " Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 
23:39:18.927189 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e592729-e1ff-4707-8fd7-379eff2c5790-etc-machine-id\") pod \"6e592729-e1ff-4707-8fd7-379eff2c5790\" (UID: \"6e592729-e1ff-4707-8fd7-379eff2c5790\") " Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.927759 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e592729-e1ff-4707-8fd7-379eff2c5790-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6e592729-e1ff-4707-8fd7-379eff2c5790" (UID: "6e592729-e1ff-4707-8fd7-379eff2c5790"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.941734 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e592729-e1ff-4707-8fd7-379eff2c5790-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6e592729-e1ff-4707-8fd7-379eff2c5790" (UID: "6e592729-e1ff-4707-8fd7-379eff2c5790"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.944664 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a6a790b-5626-41aa-994f-0c0740790a7d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "6a6a790b-5626-41aa-994f-0c0740790a7d" (UID: "6a6a790b-5626-41aa-994f-0c0740790a7d"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.959796 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e592729-e1ff-4707-8fd7-379eff2c5790-logs" (OuterVolumeSpecName: "logs") pod "6e592729-e1ff-4707-8fd7-379eff2c5790" (UID: "6e592729-e1ff-4707-8fd7-379eff2c5790"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.966862 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e592729-e1ff-4707-8fd7-379eff2c5790-kube-api-access-m2fcr" (OuterVolumeSpecName: "kube-api-access-m2fcr") pod "6e592729-e1ff-4707-8fd7-379eff2c5790" (UID: "6e592729-e1ff-4707-8fd7-379eff2c5790"). InnerVolumeSpecName "kube-api-access-m2fcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.971780 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a6a790b-5626-41aa-994f-0c0740790a7d-kube-api-access-kvhk8" (OuterVolumeSpecName: "kube-api-access-kvhk8") pod "6a6a790b-5626-41aa-994f-0c0740790a7d" (UID: "6a6a790b-5626-41aa-994f-0c0740790a7d"). InnerVolumeSpecName "kube-api-access-kvhk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:39:18 crc kubenswrapper[4734]: I1205 23:39:18.988873 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e592729-e1ff-4707-8fd7-379eff2c5790-scripts" (OuterVolumeSpecName: "scripts") pod "6e592729-e1ff-4707-8fd7-379eff2c5790" (UID: "6e592729-e1ff-4707-8fd7-379eff2c5790"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.029681 4734 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e592729-e1ff-4707-8fd7-379eff2c5790-logs\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.029741 4734 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6a6a790b-5626-41aa-994f-0c0740790a7d-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.029756 4734 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e592729-e1ff-4707-8fd7-379eff2c5790-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.029794 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvhk8\" (UniqueName: \"kubernetes.io/projected/6a6a790b-5626-41aa-994f-0c0740790a7d-kube-api-access-kvhk8\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.029810 4734 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e592729-e1ff-4707-8fd7-379eff2c5790-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.029819 4734 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e592729-e1ff-4707-8fd7-379eff2c5790-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.029829 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2fcr\" (UniqueName: \"kubernetes.io/projected/6e592729-e1ff-4707-8fd7-379eff2c5790-kube-api-access-m2fcr\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.033022 4734 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e592729-e1ff-4707-8fd7-379eff2c5790-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e592729-e1ff-4707-8fd7-379eff2c5790" (UID: "6e592729-e1ff-4707-8fd7-379eff2c5790"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.054419 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a6a790b-5626-41aa-994f-0c0740790a7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a6a790b-5626-41aa-994f-0c0740790a7d" (UID: "6a6a790b-5626-41aa-994f-0c0740790a7d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.093900 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a6a790b-5626-41aa-994f-0c0740790a7d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "6a6a790b-5626-41aa-994f-0c0740790a7d" (UID: "6a6a790b-5626-41aa-994f-0c0740790a7d"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.123624 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e592729-e1ff-4707-8fd7-379eff2c5790-config-data" (OuterVolumeSpecName: "config-data") pod "6e592729-e1ff-4707-8fd7-379eff2c5790" (UID: "6e592729-e1ff-4707-8fd7-379eff2c5790"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.125714 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a6a790b-5626-41aa-994f-0c0740790a7d-config" (OuterVolumeSpecName: "config") pod "6a6a790b-5626-41aa-994f-0c0740790a7d" (UID: "6a6a790b-5626-41aa-994f-0c0740790a7d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.133075 4734 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a6a790b-5626-41aa-994f-0c0740790a7d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.133129 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e592729-e1ff-4707-8fd7-379eff2c5790-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.133147 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6a6a790b-5626-41aa-994f-0c0740790a7d-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.133161 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e592729-e1ff-4707-8fd7-379eff2c5790-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.133173 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a6a790b-5626-41aa-994f-0c0740790a7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:19 crc kubenswrapper[4734]: E1205 23:39:19.179679 4734 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21a9837d_cc1d_4bf1_9b52_f9196880e367.slice/crio-e3a126dc953224f7cf69d43ba28034a6b6151fba211d245d8ef374996a56a5fa.scope\": RecentStats: unable to find data in memory cache]" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.635322 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-654798d8cb-lq2fv" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.657543 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-654798d8cb-lq2fv" event={"ID":"6a6a790b-5626-41aa-994f-0c0740790a7d","Type":"ContainerDied","Data":"f8b554734d88d9c4942018b0b5319d9559c1d03041cdb1568268413735cf4883"} Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.657601 4734 scope.go:117] "RemoveContainer" containerID="efb38390a599c517100ebac471fbf6ae2aa043331ad9aa34737375b0e2c3b959" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.680426 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.680866 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6e592729-e1ff-4707-8fd7-379eff2c5790","Type":"ContainerDied","Data":"bb1c2e3ca8f944e69f5f17b4ae76a107c07db0ce047e77284004e3153141ce5c"} Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.734203 4734 scope.go:117] "RemoveContainer" containerID="2d13d89d313484f007afec1ce6d2ebd5001f4bd6907f6366fc11c7dec8e9b6b1" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.773589 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-654798d8cb-lq2fv"] Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.778717 4734 scope.go:117] "RemoveContainer" containerID="4efbe28e57748020b3de40562cccc01604426334d908232d354b84b890c4cb1f" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.788361 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-654798d8cb-lq2fv"] Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.834588 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.840246 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.850788 4734 scope.go:117] "RemoveContainer" containerID="e4e7c0296a369512765c2d12a0791812c6be741da4f9ebe4953a2b1d67331f32" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.852667 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 05 23:39:19 crc kubenswrapper[4734]: E1205 23:39:19.853215 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e592729-e1ff-4707-8fd7-379eff2c5790" containerName="cinder-api-log" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.853242 4734 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6e592729-e1ff-4707-8fd7-379eff2c5790" containerName="cinder-api-log" Dec 05 23:39:19 crc kubenswrapper[4734]: E1205 23:39:19.853259 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e592729-e1ff-4707-8fd7-379eff2c5790" containerName="cinder-api" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.853269 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e592729-e1ff-4707-8fd7-379eff2c5790" containerName="cinder-api" Dec 05 23:39:19 crc kubenswrapper[4734]: E1205 23:39:19.853283 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a6a790b-5626-41aa-994f-0c0740790a7d" containerName="neutron-httpd" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.853291 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a6a790b-5626-41aa-994f-0c0740790a7d" containerName="neutron-httpd" Dec 05 23:39:19 crc kubenswrapper[4734]: E1205 23:39:19.853308 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a6a790b-5626-41aa-994f-0c0740790a7d" containerName="neutron-api" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.853317 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a6a790b-5626-41aa-994f-0c0740790a7d" containerName="neutron-api" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.853614 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a6a790b-5626-41aa-994f-0c0740790a7d" containerName="neutron-httpd" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.853648 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e592729-e1ff-4707-8fd7-379eff2c5790" containerName="cinder-api" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.853660 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a6a790b-5626-41aa-994f-0c0740790a7d" containerName="neutron-api" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.853679 4734 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6e592729-e1ff-4707-8fd7-379eff2c5790" containerName="cinder-api-log" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.859224 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.863012 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.863240 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.863382 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.866449 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.955788 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c79a17a-a1f1-481f-90de-cdcfe632a079-public-tls-certs\") pod \"cinder-api-0\" (UID: \"6c79a17a-a1f1-481f-90de-cdcfe632a079\") " pod="openstack/cinder-api-0" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.956357 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c79a17a-a1f1-481f-90de-cdcfe632a079-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"6c79a17a-a1f1-481f-90de-cdcfe632a079\") " pod="openstack/cinder-api-0" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.956385 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c79a17a-a1f1-481f-90de-cdcfe632a079-config-data\") pod \"cinder-api-0\" (UID: \"6c79a17a-a1f1-481f-90de-cdcfe632a079\") " 
pod="openstack/cinder-api-0" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.956496 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c79a17a-a1f1-481f-90de-cdcfe632a079-scripts\") pod \"cinder-api-0\" (UID: \"6c79a17a-a1f1-481f-90de-cdcfe632a079\") " pod="openstack/cinder-api-0" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.956546 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c79a17a-a1f1-481f-90de-cdcfe632a079-config-data-custom\") pod \"cinder-api-0\" (UID: \"6c79a17a-a1f1-481f-90de-cdcfe632a079\") " pod="openstack/cinder-api-0" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.956580 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5xhj\" (UniqueName: \"kubernetes.io/projected/6c79a17a-a1f1-481f-90de-cdcfe632a079-kube-api-access-w5xhj\") pod \"cinder-api-0\" (UID: \"6c79a17a-a1f1-481f-90de-cdcfe632a079\") " pod="openstack/cinder-api-0" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.956608 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6c79a17a-a1f1-481f-90de-cdcfe632a079-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6c79a17a-a1f1-481f-90de-cdcfe632a079\") " pod="openstack/cinder-api-0" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.956653 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c79a17a-a1f1-481f-90de-cdcfe632a079-logs\") pod \"cinder-api-0\" (UID: \"6c79a17a-a1f1-481f-90de-cdcfe632a079\") " pod="openstack/cinder-api-0" Dec 05 23:39:19 crc kubenswrapper[4734]: I1205 23:39:19.956745 4734 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c79a17a-a1f1-481f-90de-cdcfe632a079-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6c79a17a-a1f1-481f-90de-cdcfe632a079\") " pod="openstack/cinder-api-0" Dec 05 23:39:20 crc kubenswrapper[4734]: I1205 23:39:20.059519 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c79a17a-a1f1-481f-90de-cdcfe632a079-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6c79a17a-a1f1-481f-90de-cdcfe632a079\") " pod="openstack/cinder-api-0" Dec 05 23:39:20 crc kubenswrapper[4734]: I1205 23:39:20.059609 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c79a17a-a1f1-481f-90de-cdcfe632a079-public-tls-certs\") pod \"cinder-api-0\" (UID: \"6c79a17a-a1f1-481f-90de-cdcfe632a079\") " pod="openstack/cinder-api-0" Dec 05 23:39:20 crc kubenswrapper[4734]: I1205 23:39:20.059635 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c79a17a-a1f1-481f-90de-cdcfe632a079-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"6c79a17a-a1f1-481f-90de-cdcfe632a079\") " pod="openstack/cinder-api-0" Dec 05 23:39:20 crc kubenswrapper[4734]: I1205 23:39:20.059655 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c79a17a-a1f1-481f-90de-cdcfe632a079-config-data\") pod \"cinder-api-0\" (UID: \"6c79a17a-a1f1-481f-90de-cdcfe632a079\") " pod="openstack/cinder-api-0" Dec 05 23:39:20 crc kubenswrapper[4734]: I1205 23:39:20.059729 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c79a17a-a1f1-481f-90de-cdcfe632a079-scripts\") pod \"cinder-api-0\" (UID: 
\"6c79a17a-a1f1-481f-90de-cdcfe632a079\") " pod="openstack/cinder-api-0" Dec 05 23:39:20 crc kubenswrapper[4734]: I1205 23:39:20.059751 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c79a17a-a1f1-481f-90de-cdcfe632a079-config-data-custom\") pod \"cinder-api-0\" (UID: \"6c79a17a-a1f1-481f-90de-cdcfe632a079\") " pod="openstack/cinder-api-0" Dec 05 23:39:20 crc kubenswrapper[4734]: I1205 23:39:20.059776 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5xhj\" (UniqueName: \"kubernetes.io/projected/6c79a17a-a1f1-481f-90de-cdcfe632a079-kube-api-access-w5xhj\") pod \"cinder-api-0\" (UID: \"6c79a17a-a1f1-481f-90de-cdcfe632a079\") " pod="openstack/cinder-api-0" Dec 05 23:39:20 crc kubenswrapper[4734]: I1205 23:39:20.059795 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6c79a17a-a1f1-481f-90de-cdcfe632a079-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6c79a17a-a1f1-481f-90de-cdcfe632a079\") " pod="openstack/cinder-api-0" Dec 05 23:39:20 crc kubenswrapper[4734]: I1205 23:39:20.059823 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c79a17a-a1f1-481f-90de-cdcfe632a079-logs\") pod \"cinder-api-0\" (UID: \"6c79a17a-a1f1-481f-90de-cdcfe632a079\") " pod="openstack/cinder-api-0" Dec 05 23:39:20 crc kubenswrapper[4734]: I1205 23:39:20.060388 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c79a17a-a1f1-481f-90de-cdcfe632a079-logs\") pod \"cinder-api-0\" (UID: \"6c79a17a-a1f1-481f-90de-cdcfe632a079\") " pod="openstack/cinder-api-0" Dec 05 23:39:20 crc kubenswrapper[4734]: I1205 23:39:20.062630 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/6c79a17a-a1f1-481f-90de-cdcfe632a079-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6c79a17a-a1f1-481f-90de-cdcfe632a079\") " pod="openstack/cinder-api-0" Dec 05 23:39:20 crc kubenswrapper[4734]: I1205 23:39:20.070017 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c79a17a-a1f1-481f-90de-cdcfe632a079-scripts\") pod \"cinder-api-0\" (UID: \"6c79a17a-a1f1-481f-90de-cdcfe632a079\") " pod="openstack/cinder-api-0" Dec 05 23:39:20 crc kubenswrapper[4734]: I1205 23:39:20.074142 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c79a17a-a1f1-481f-90de-cdcfe632a079-config-data-custom\") pod \"cinder-api-0\" (UID: \"6c79a17a-a1f1-481f-90de-cdcfe632a079\") " pod="openstack/cinder-api-0" Dec 05 23:39:20 crc kubenswrapper[4734]: I1205 23:39:20.079668 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c79a17a-a1f1-481f-90de-cdcfe632a079-config-data\") pod \"cinder-api-0\" (UID: \"6c79a17a-a1f1-481f-90de-cdcfe632a079\") " pod="openstack/cinder-api-0" Dec 05 23:39:20 crc kubenswrapper[4734]: I1205 23:39:20.080400 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c79a17a-a1f1-481f-90de-cdcfe632a079-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6c79a17a-a1f1-481f-90de-cdcfe632a079\") " pod="openstack/cinder-api-0" Dec 05 23:39:20 crc kubenswrapper[4734]: I1205 23:39:20.081308 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c79a17a-a1f1-481f-90de-cdcfe632a079-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"6c79a17a-a1f1-481f-90de-cdcfe632a079\") " pod="openstack/cinder-api-0" Dec 05 23:39:20 crc kubenswrapper[4734]: I1205 23:39:20.083651 4734 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5xhj\" (UniqueName: \"kubernetes.io/projected/6c79a17a-a1f1-481f-90de-cdcfe632a079-kube-api-access-w5xhj\") pod \"cinder-api-0\" (UID: \"6c79a17a-a1f1-481f-90de-cdcfe632a079\") " pod="openstack/cinder-api-0" Dec 05 23:39:20 crc kubenswrapper[4734]: I1205 23:39:20.083942 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c79a17a-a1f1-481f-90de-cdcfe632a079-public-tls-certs\") pod \"cinder-api-0\" (UID: \"6c79a17a-a1f1-481f-90de-cdcfe632a079\") " pod="openstack/cinder-api-0" Dec 05 23:39:20 crc kubenswrapper[4734]: I1205 23:39:20.190748 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 23:39:20 crc kubenswrapper[4734]: I1205 23:39:20.445339 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:39:20 crc kubenswrapper[4734]: I1205 23:39:20.445408 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:39:20 crc kubenswrapper[4734]: I1205 23:39:20.829646 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 23:39:21 crc kubenswrapper[4734]: I1205 23:39:21.241306 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-fffd48d8f-srcmr" Dec 05 23:39:21 crc kubenswrapper[4734]: I1205 23:39:21.578552 4734 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 05 23:39:21 crc kubenswrapper[4734]: I1205 23:39:21.641628 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a6a790b-5626-41aa-994f-0c0740790a7d" path="/var/lib/kubelet/pods/6a6a790b-5626-41aa-994f-0c0740790a7d/volumes" Dec 05 23:39:21 crc kubenswrapper[4734]: I1205 23:39:21.642639 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e592729-e1ff-4707-8fd7-379eff2c5790" path="/var/lib/kubelet/pods/6e592729-e1ff-4707-8fd7-379eff2c5790/volumes" Dec 05 23:39:21 crc kubenswrapper[4734]: I1205 23:39:21.684110 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-75b7946cc8-hzcp7" Dec 05 23:39:21 crc kubenswrapper[4734]: I1205 23:39:21.687677 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-d6vvd" Dec 05 23:39:21 crc kubenswrapper[4734]: I1205 23:39:21.753004 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6c79a17a-a1f1-481f-90de-cdcfe632a079","Type":"ContainerStarted","Data":"086edb26aa83da32aa0511a9af6833c6cdd2d843c9eb1854d9bee8ca896826a4"} Dec 05 23:39:21 crc kubenswrapper[4734]: I1205 23:39:21.892830 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-gxrch"] Dec 05 23:39:21 crc kubenswrapper[4734]: I1205 23:39:21.893188 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-gxrch" podUID="de3b13fb-9708-44af-bd09-f9be8514121e" containerName="dnsmasq-dns" containerID="cri-o://bd86aa3a14b4c4ea3568fc35224329174b7da35ecc2b98b714bb50e2a2945b9f" gracePeriod=10 Dec 05 23:39:21 crc kubenswrapper[4734]: I1205 23:39:21.900284 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5d469948dd-n7t4x" podUID="c96cd173-4707-4edc-a92e-35db297082e2" containerName="horizon" 
probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Dec 05 23:39:22 crc kubenswrapper[4734]: I1205 23:39:22.161053 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-85fbcb99c8-4gdvt" Dec 05 23:39:22 crc kubenswrapper[4734]: I1205 23:39:22.463177 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 05 23:39:22 crc kubenswrapper[4734]: I1205 23:39:22.562359 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 23:39:22 crc kubenswrapper[4734]: I1205 23:39:22.656196 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-75b7946cc8-hzcp7" Dec 05 23:39:22 crc kubenswrapper[4734]: I1205 23:39:22.798923 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-gxrch" Dec 05 23:39:22 crc kubenswrapper[4734]: I1205 23:39:22.818002 4734 generic.go:334] "Generic (PLEG): container finished" podID="c96cd173-4707-4edc-a92e-35db297082e2" containerID="9514774e87490121e03a56ce448d088925b7977bc46c6a8cf0102e7f1954b7d8" exitCode=0 Dec 05 23:39:22 crc kubenswrapper[4734]: I1205 23:39:22.818093 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d469948dd-n7t4x" event={"ID":"c96cd173-4707-4edc-a92e-35db297082e2","Type":"ContainerDied","Data":"9514774e87490121e03a56ce448d088925b7977bc46c6a8cf0102e7f1954b7d8"} Dec 05 23:39:22 crc kubenswrapper[4734]: I1205 23:39:22.899689 4734 generic.go:334] "Generic (PLEG): container finished" podID="de3b13fb-9708-44af-bd09-f9be8514121e" containerID="bd86aa3a14b4c4ea3568fc35224329174b7da35ecc2b98b714bb50e2a2945b9f" exitCode=0 Dec 05 23:39:22 crc kubenswrapper[4734]: I1205 23:39:22.899769 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-55f844cf75-gxrch" event={"ID":"de3b13fb-9708-44af-bd09-f9be8514121e","Type":"ContainerDied","Data":"bd86aa3a14b4c4ea3568fc35224329174b7da35ecc2b98b714bb50e2a2945b9f"} Dec 05 23:39:22 crc kubenswrapper[4734]: I1205 23:39:22.899805 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-gxrch" event={"ID":"de3b13fb-9708-44af-bd09-f9be8514121e","Type":"ContainerDied","Data":"139416cd88ae51b920b2e6aada6252c49461a7153b9e5477dcbc4e64e5e09f2f"} Dec 05 23:39:22 crc kubenswrapper[4734]: I1205 23:39:22.899826 4734 scope.go:117] "RemoveContainer" containerID="bd86aa3a14b4c4ea3568fc35224329174b7da35ecc2b98b714bb50e2a2945b9f" Dec 05 23:39:22 crc kubenswrapper[4734]: I1205 23:39:22.899977 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-gxrch" Dec 05 23:39:22 crc kubenswrapper[4734]: I1205 23:39:22.924268 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e2f773bf-b6a2-4f90-bf47-f3bd63431381" containerName="cinder-scheduler" containerID="cri-o://fe4d8d710b2e9360a934579a2cdd9ae758e44c010f3a43db871e05efcb6663a9" gracePeriod=30 Dec 05 23:39:22 crc kubenswrapper[4734]: I1205 23:39:22.924646 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6c79a17a-a1f1-481f-90de-cdcfe632a079","Type":"ContainerStarted","Data":"def6ca3c9cab662d2a93200638ad1f55a6ab17c7bc66d75ce9c2da1d4653e300"} Dec 05 23:39:22 crc kubenswrapper[4734]: I1205 23:39:22.925008 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e2f773bf-b6a2-4f90-bf47-f3bd63431381" containerName="probe" containerID="cri-o://113ed975ba333f7b50bd27ac66d62f813eea9cac077c729a734a72647d330de8" gracePeriod=30 Dec 05 23:39:22 crc kubenswrapper[4734]: I1205 23:39:22.960598 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de3b13fb-9708-44af-bd09-f9be8514121e-ovsdbserver-sb\") pod \"de3b13fb-9708-44af-bd09-f9be8514121e\" (UID: \"de3b13fb-9708-44af-bd09-f9be8514121e\") " Dec 05 23:39:22 crc kubenswrapper[4734]: I1205 23:39:22.960662 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de3b13fb-9708-44af-bd09-f9be8514121e-config\") pod \"de3b13fb-9708-44af-bd09-f9be8514121e\" (UID: \"de3b13fb-9708-44af-bd09-f9be8514121e\") " Dec 05 23:39:22 crc kubenswrapper[4734]: I1205 23:39:22.960695 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de3b13fb-9708-44af-bd09-f9be8514121e-ovsdbserver-nb\") pod \"de3b13fb-9708-44af-bd09-f9be8514121e\" (UID: \"de3b13fb-9708-44af-bd09-f9be8514121e\") " Dec 05 23:39:22 crc kubenswrapper[4734]: I1205 23:39:22.960716 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/de3b13fb-9708-44af-bd09-f9be8514121e-dns-swift-storage-0\") pod \"de3b13fb-9708-44af-bd09-f9be8514121e\" (UID: \"de3b13fb-9708-44af-bd09-f9be8514121e\") " Dec 05 23:39:22 crc kubenswrapper[4734]: I1205 23:39:22.960782 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhz68\" (UniqueName: \"kubernetes.io/projected/de3b13fb-9708-44af-bd09-f9be8514121e-kube-api-access-rhz68\") pod \"de3b13fb-9708-44af-bd09-f9be8514121e\" (UID: \"de3b13fb-9708-44af-bd09-f9be8514121e\") " Dec 05 23:39:22 crc kubenswrapper[4734]: I1205 23:39:22.960902 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de3b13fb-9708-44af-bd09-f9be8514121e-dns-svc\") pod \"de3b13fb-9708-44af-bd09-f9be8514121e\" (UID: \"de3b13fb-9708-44af-bd09-f9be8514121e\") " Dec 05 
23:39:23 crc kubenswrapper[4734]: I1205 23:39:23.002199 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de3b13fb-9708-44af-bd09-f9be8514121e-kube-api-access-rhz68" (OuterVolumeSpecName: "kube-api-access-rhz68") pod "de3b13fb-9708-44af-bd09-f9be8514121e" (UID: "de3b13fb-9708-44af-bd09-f9be8514121e"). InnerVolumeSpecName "kube-api-access-rhz68". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:39:23 crc kubenswrapper[4734]: I1205 23:39:23.034600 4734 scope.go:117] "RemoveContainer" containerID="b58110ceaa8ee50a0007f940b537cb62ec0f23aa523c39b26941fb5154296dc7" Dec 05 23:39:23 crc kubenswrapper[4734]: I1205 23:39:23.062958 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhz68\" (UniqueName: \"kubernetes.io/projected/de3b13fb-9708-44af-bd09-f9be8514121e-kube-api-access-rhz68\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:23 crc kubenswrapper[4734]: I1205 23:39:23.123504 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de3b13fb-9708-44af-bd09-f9be8514121e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "de3b13fb-9708-44af-bd09-f9be8514121e" (UID: "de3b13fb-9708-44af-bd09-f9be8514121e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:39:23 crc kubenswrapper[4734]: I1205 23:39:23.145473 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de3b13fb-9708-44af-bd09-f9be8514121e-config" (OuterVolumeSpecName: "config") pod "de3b13fb-9708-44af-bd09-f9be8514121e" (UID: "de3b13fb-9708-44af-bd09-f9be8514121e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:39:23 crc kubenswrapper[4734]: I1205 23:39:23.145836 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de3b13fb-9708-44af-bd09-f9be8514121e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "de3b13fb-9708-44af-bd09-f9be8514121e" (UID: "de3b13fb-9708-44af-bd09-f9be8514121e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:39:23 crc kubenswrapper[4734]: I1205 23:39:23.147390 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de3b13fb-9708-44af-bd09-f9be8514121e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "de3b13fb-9708-44af-bd09-f9be8514121e" (UID: "de3b13fb-9708-44af-bd09-f9be8514121e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:39:23 crc kubenswrapper[4734]: I1205 23:39:23.165870 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de3b13fb-9708-44af-bd09-f9be8514121e-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:23 crc kubenswrapper[4734]: I1205 23:39:23.166036 4734 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de3b13fb-9708-44af-bd09-f9be8514121e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:23 crc kubenswrapper[4734]: I1205 23:39:23.166053 4734 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/de3b13fb-9708-44af-bd09-f9be8514121e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:23 crc kubenswrapper[4734]: I1205 23:39:23.166062 4734 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de3b13fb-9708-44af-bd09-f9be8514121e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:23 crc 
kubenswrapper[4734]: I1205 23:39:23.210675 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de3b13fb-9708-44af-bd09-f9be8514121e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "de3b13fb-9708-44af-bd09-f9be8514121e" (UID: "de3b13fb-9708-44af-bd09-f9be8514121e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:39:23 crc kubenswrapper[4734]: I1205 23:39:23.268187 4734 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de3b13fb-9708-44af-bd09-f9be8514121e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:23 crc kubenswrapper[4734]: I1205 23:39:23.351932 4734 scope.go:117] "RemoveContainer" containerID="bd86aa3a14b4c4ea3568fc35224329174b7da35ecc2b98b714bb50e2a2945b9f" Dec 05 23:39:23 crc kubenswrapper[4734]: E1205 23:39:23.354874 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd86aa3a14b4c4ea3568fc35224329174b7da35ecc2b98b714bb50e2a2945b9f\": container with ID starting with bd86aa3a14b4c4ea3568fc35224329174b7da35ecc2b98b714bb50e2a2945b9f not found: ID does not exist" containerID="bd86aa3a14b4c4ea3568fc35224329174b7da35ecc2b98b714bb50e2a2945b9f" Dec 05 23:39:23 crc kubenswrapper[4734]: I1205 23:39:23.354914 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd86aa3a14b4c4ea3568fc35224329174b7da35ecc2b98b714bb50e2a2945b9f"} err="failed to get container status \"bd86aa3a14b4c4ea3568fc35224329174b7da35ecc2b98b714bb50e2a2945b9f\": rpc error: code = NotFound desc = could not find container \"bd86aa3a14b4c4ea3568fc35224329174b7da35ecc2b98b714bb50e2a2945b9f\": container with ID starting with bd86aa3a14b4c4ea3568fc35224329174b7da35ecc2b98b714bb50e2a2945b9f not found: ID does not exist" Dec 05 23:39:23 crc kubenswrapper[4734]: I1205 23:39:23.354943 4734 
scope.go:117] "RemoveContainer" containerID="b58110ceaa8ee50a0007f940b537cb62ec0f23aa523c39b26941fb5154296dc7" Dec 05 23:39:23 crc kubenswrapper[4734]: E1205 23:39:23.356016 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b58110ceaa8ee50a0007f940b537cb62ec0f23aa523c39b26941fb5154296dc7\": container with ID starting with b58110ceaa8ee50a0007f940b537cb62ec0f23aa523c39b26941fb5154296dc7 not found: ID does not exist" containerID="b58110ceaa8ee50a0007f940b537cb62ec0f23aa523c39b26941fb5154296dc7" Dec 05 23:39:23 crc kubenswrapper[4734]: I1205 23:39:23.356048 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b58110ceaa8ee50a0007f940b537cb62ec0f23aa523c39b26941fb5154296dc7"} err="failed to get container status \"b58110ceaa8ee50a0007f940b537cb62ec0f23aa523c39b26941fb5154296dc7\": rpc error: code = NotFound desc = could not find container \"b58110ceaa8ee50a0007f940b537cb62ec0f23aa523c39b26941fb5154296dc7\": container with ID starting with b58110ceaa8ee50a0007f940b537cb62ec0f23aa523c39b26941fb5154296dc7 not found: ID does not exist" Dec 05 23:39:23 crc kubenswrapper[4734]: I1205 23:39:23.538072 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-gxrch"] Dec 05 23:39:23 crc kubenswrapper[4734]: I1205 23:39:23.570663 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-gxrch"] Dec 05 23:39:23 crc kubenswrapper[4734]: I1205 23:39:23.641988 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de3b13fb-9708-44af-bd09-f9be8514121e" path="/var/lib/kubelet/pods/de3b13fb-9708-44af-bd09-f9be8514121e/volumes" Dec 05 23:39:23 crc kubenswrapper[4734]: I1205 23:39:23.937158 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"6c79a17a-a1f1-481f-90de-cdcfe632a079","Type":"ContainerStarted","Data":"587d8bee4ec171414f1bb1e654addaefd7902526581dedc35f24694923f7a465"} Dec 05 23:39:23 crc kubenswrapper[4734]: I1205 23:39:23.937324 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 05 23:39:23 crc kubenswrapper[4734]: I1205 23:39:23.939630 4734 generic.go:334] "Generic (PLEG): container finished" podID="e2f773bf-b6a2-4f90-bf47-f3bd63431381" containerID="113ed975ba333f7b50bd27ac66d62f813eea9cac077c729a734a72647d330de8" exitCode=0 Dec 05 23:39:23 crc kubenswrapper[4734]: I1205 23:39:23.939748 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e2f773bf-b6a2-4f90-bf47-f3bd63431381","Type":"ContainerDied","Data":"113ed975ba333f7b50bd27ac66d62f813eea9cac077c729a734a72647d330de8"} Dec 05 23:39:23 crc kubenswrapper[4734]: I1205 23:39:23.972326 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.9722921190000005 podStartE2EDuration="4.972292119s" podCreationTimestamp="2025-12-05 23:39:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:39:23.959285704 +0000 UTC m=+1184.642689970" watchObservedRunningTime="2025-12-05 23:39:23.972292119 +0000 UTC m=+1184.655696395" Dec 05 23:39:25 crc kubenswrapper[4734]: I1205 23:39:25.350989 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 05 23:39:25 crc kubenswrapper[4734]: E1205 23:39:25.351392 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de3b13fb-9708-44af-bd09-f9be8514121e" containerName="dnsmasq-dns" Dec 05 23:39:25 crc kubenswrapper[4734]: I1205 23:39:25.351407 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="de3b13fb-9708-44af-bd09-f9be8514121e" containerName="dnsmasq-dns" Dec 05 23:39:25 crc 
kubenswrapper[4734]: E1205 23:39:25.351444 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de3b13fb-9708-44af-bd09-f9be8514121e" containerName="init" Dec 05 23:39:25 crc kubenswrapper[4734]: I1205 23:39:25.351451 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="de3b13fb-9708-44af-bd09-f9be8514121e" containerName="init" Dec 05 23:39:25 crc kubenswrapper[4734]: I1205 23:39:25.354785 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="de3b13fb-9708-44af-bd09-f9be8514121e" containerName="dnsmasq-dns" Dec 05 23:39:25 crc kubenswrapper[4734]: I1205 23:39:25.355576 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 05 23:39:25 crc kubenswrapper[4734]: I1205 23:39:25.358231 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 05 23:39:25 crc kubenswrapper[4734]: I1205 23:39:25.359547 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-qrwgh" Dec 05 23:39:25 crc kubenswrapper[4734]: I1205 23:39:25.370849 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 05 23:39:25 crc kubenswrapper[4734]: I1205 23:39:25.372066 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 05 23:39:25 crc kubenswrapper[4734]: I1205 23:39:25.522565 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1538ece1-e24d-4f20-b92d-0b526d1f5698-openstack-config\") pod \"openstackclient\" (UID: \"1538ece1-e24d-4f20-b92d-0b526d1f5698\") " pod="openstack/openstackclient" Dec 05 23:39:25 crc kubenswrapper[4734]: I1205 23:39:25.522624 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1538ece1-e24d-4f20-b92d-0b526d1f5698-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1538ece1-e24d-4f20-b92d-0b526d1f5698\") " pod="openstack/openstackclient" Dec 05 23:39:25 crc kubenswrapper[4734]: I1205 23:39:25.522652 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gndlg\" (UniqueName: \"kubernetes.io/projected/1538ece1-e24d-4f20-b92d-0b526d1f5698-kube-api-access-gndlg\") pod \"openstackclient\" (UID: \"1538ece1-e24d-4f20-b92d-0b526d1f5698\") " pod="openstack/openstackclient" Dec 05 23:39:25 crc kubenswrapper[4734]: I1205 23:39:25.522689 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1538ece1-e24d-4f20-b92d-0b526d1f5698-openstack-config-secret\") pod \"openstackclient\" (UID: \"1538ece1-e24d-4f20-b92d-0b526d1f5698\") " pod="openstack/openstackclient" Dec 05 23:39:25 crc kubenswrapper[4734]: I1205 23:39:25.624426 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1538ece1-e24d-4f20-b92d-0b526d1f5698-openstack-config\") pod \"openstackclient\" (UID: \"1538ece1-e24d-4f20-b92d-0b526d1f5698\") " pod="openstack/openstackclient" Dec 05 23:39:25 crc kubenswrapper[4734]: I1205 23:39:25.624843 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1538ece1-e24d-4f20-b92d-0b526d1f5698-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1538ece1-e24d-4f20-b92d-0b526d1f5698\") " pod="openstack/openstackclient" Dec 05 23:39:25 crc kubenswrapper[4734]: I1205 23:39:25.624973 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gndlg\" (UniqueName: \"kubernetes.io/projected/1538ece1-e24d-4f20-b92d-0b526d1f5698-kube-api-access-gndlg\") pod 
\"openstackclient\" (UID: \"1538ece1-e24d-4f20-b92d-0b526d1f5698\") " pod="openstack/openstackclient" Dec 05 23:39:25 crc kubenswrapper[4734]: I1205 23:39:25.625125 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1538ece1-e24d-4f20-b92d-0b526d1f5698-openstack-config-secret\") pod \"openstackclient\" (UID: \"1538ece1-e24d-4f20-b92d-0b526d1f5698\") " pod="openstack/openstackclient" Dec 05 23:39:25 crc kubenswrapper[4734]: I1205 23:39:25.626440 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1538ece1-e24d-4f20-b92d-0b526d1f5698-openstack-config\") pod \"openstackclient\" (UID: \"1538ece1-e24d-4f20-b92d-0b526d1f5698\") " pod="openstack/openstackclient" Dec 05 23:39:25 crc kubenswrapper[4734]: I1205 23:39:25.634362 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1538ece1-e24d-4f20-b92d-0b526d1f5698-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1538ece1-e24d-4f20-b92d-0b526d1f5698\") " pod="openstack/openstackclient" Dec 05 23:39:25 crc kubenswrapper[4734]: I1205 23:39:25.641075 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1538ece1-e24d-4f20-b92d-0b526d1f5698-openstack-config-secret\") pod \"openstackclient\" (UID: \"1538ece1-e24d-4f20-b92d-0b526d1f5698\") " pod="openstack/openstackclient" Dec 05 23:39:25 crc kubenswrapper[4734]: I1205 23:39:25.650087 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gndlg\" (UniqueName: \"kubernetes.io/projected/1538ece1-e24d-4f20-b92d-0b526d1f5698-kube-api-access-gndlg\") pod \"openstackclient\" (UID: \"1538ece1-e24d-4f20-b92d-0b526d1f5698\") " pod="openstack/openstackclient" Dec 05 23:39:25 crc kubenswrapper[4734]: I1205 
23:39:25.689810 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 05 23:39:26 crc kubenswrapper[4734]: I1205 23:39:26.025436 4734 generic.go:334] "Generic (PLEG): container finished" podID="e2f773bf-b6a2-4f90-bf47-f3bd63431381" containerID="fe4d8d710b2e9360a934579a2cdd9ae758e44c010f3a43db871e05efcb6663a9" exitCode=0 Dec 05 23:39:26 crc kubenswrapper[4734]: I1205 23:39:26.025625 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e2f773bf-b6a2-4f90-bf47-f3bd63431381","Type":"ContainerDied","Data":"fe4d8d710b2e9360a934579a2cdd9ae758e44c010f3a43db871e05efcb6663a9"} Dec 05 23:39:26 crc kubenswrapper[4734]: I1205 23:39:26.272810 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 23:39:26 crc kubenswrapper[4734]: I1205 23:39:26.348240 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96dl9\" (UniqueName: \"kubernetes.io/projected/e2f773bf-b6a2-4f90-bf47-f3bd63431381-kube-api-access-96dl9\") pod \"e2f773bf-b6a2-4f90-bf47-f3bd63431381\" (UID: \"e2f773bf-b6a2-4f90-bf47-f3bd63431381\") " Dec 05 23:39:26 crc kubenswrapper[4734]: I1205 23:39:26.348410 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f773bf-b6a2-4f90-bf47-f3bd63431381-config-data\") pod \"e2f773bf-b6a2-4f90-bf47-f3bd63431381\" (UID: \"e2f773bf-b6a2-4f90-bf47-f3bd63431381\") " Dec 05 23:39:26 crc kubenswrapper[4734]: I1205 23:39:26.348458 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2f773bf-b6a2-4f90-bf47-f3bd63431381-scripts\") pod \"e2f773bf-b6a2-4f90-bf47-f3bd63431381\" (UID: \"e2f773bf-b6a2-4f90-bf47-f3bd63431381\") " Dec 05 23:39:26 crc kubenswrapper[4734]: I1205 23:39:26.348590 4734 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2f773bf-b6a2-4f90-bf47-f3bd63431381-config-data-custom\") pod \"e2f773bf-b6a2-4f90-bf47-f3bd63431381\" (UID: \"e2f773bf-b6a2-4f90-bf47-f3bd63431381\") " Dec 05 23:39:26 crc kubenswrapper[4734]: I1205 23:39:26.348631 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f773bf-b6a2-4f90-bf47-f3bd63431381-combined-ca-bundle\") pod \"e2f773bf-b6a2-4f90-bf47-f3bd63431381\" (UID: \"e2f773bf-b6a2-4f90-bf47-f3bd63431381\") " Dec 05 23:39:26 crc kubenswrapper[4734]: I1205 23:39:26.348737 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e2f773bf-b6a2-4f90-bf47-f3bd63431381-etc-machine-id\") pod \"e2f773bf-b6a2-4f90-bf47-f3bd63431381\" (UID: \"e2f773bf-b6a2-4f90-bf47-f3bd63431381\") " Dec 05 23:39:26 crc kubenswrapper[4734]: I1205 23:39:26.349402 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2f773bf-b6a2-4f90-bf47-f3bd63431381-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e2f773bf-b6a2-4f90-bf47-f3bd63431381" (UID: "e2f773bf-b6a2-4f90-bf47-f3bd63431381"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 23:39:26 crc kubenswrapper[4734]: I1205 23:39:26.361017 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2f773bf-b6a2-4f90-bf47-f3bd63431381-kube-api-access-96dl9" (OuterVolumeSpecName: "kube-api-access-96dl9") pod "e2f773bf-b6a2-4f90-bf47-f3bd63431381" (UID: "e2f773bf-b6a2-4f90-bf47-f3bd63431381"). InnerVolumeSpecName "kube-api-access-96dl9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:39:26 crc kubenswrapper[4734]: I1205 23:39:26.368763 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2f773bf-b6a2-4f90-bf47-f3bd63431381-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e2f773bf-b6a2-4f90-bf47-f3bd63431381" (UID: "e2f773bf-b6a2-4f90-bf47-f3bd63431381"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:26 crc kubenswrapper[4734]: I1205 23:39:26.374809 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2f773bf-b6a2-4f90-bf47-f3bd63431381-scripts" (OuterVolumeSpecName: "scripts") pod "e2f773bf-b6a2-4f90-bf47-f3bd63431381" (UID: "e2f773bf-b6a2-4f90-bf47-f3bd63431381"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:26 crc kubenswrapper[4734]: I1205 23:39:26.387352 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 05 23:39:26 crc kubenswrapper[4734]: I1205 23:39:26.424653 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2f773bf-b6a2-4f90-bf47-f3bd63431381-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2f773bf-b6a2-4f90-bf47-f3bd63431381" (UID: "e2f773bf-b6a2-4f90-bf47-f3bd63431381"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:26 crc kubenswrapper[4734]: I1205 23:39:26.455460 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96dl9\" (UniqueName: \"kubernetes.io/projected/e2f773bf-b6a2-4f90-bf47-f3bd63431381-kube-api-access-96dl9\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:26 crc kubenswrapper[4734]: I1205 23:39:26.455505 4734 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2f773bf-b6a2-4f90-bf47-f3bd63431381-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:26 crc kubenswrapper[4734]: I1205 23:39:26.455516 4734 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2f773bf-b6a2-4f90-bf47-f3bd63431381-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:26 crc kubenswrapper[4734]: I1205 23:39:26.455541 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f773bf-b6a2-4f90-bf47-f3bd63431381-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:26 crc kubenswrapper[4734]: I1205 23:39:26.455551 4734 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e2f773bf-b6a2-4f90-bf47-f3bd63431381-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:26 crc kubenswrapper[4734]: I1205 23:39:26.550178 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2f773bf-b6a2-4f90-bf47-f3bd63431381-config-data" (OuterVolumeSpecName: "config-data") pod "e2f773bf-b6a2-4f90-bf47-f3bd63431381" (UID: "e2f773bf-b6a2-4f90-bf47-f3bd63431381"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:26 crc kubenswrapper[4734]: I1205 23:39:26.557148 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f773bf-b6a2-4f90-bf47-f3bd63431381-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.056306 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e2f773bf-b6a2-4f90-bf47-f3bd63431381","Type":"ContainerDied","Data":"f1b38848a33ae411fa49cda4e8d6f2b947c7bc712936ac9ab8ae6e8ea5e02cce"} Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.056373 4734 scope.go:117] "RemoveContainer" containerID="113ed975ba333f7b50bd27ac66d62f813eea9cac077c729a734a72647d330de8" Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.056333 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.071750 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"1538ece1-e24d-4f20-b92d-0b526d1f5698","Type":"ContainerStarted","Data":"d61b7ad1e44987dc7420dff430797e3092d7cafa8ba2063ee84275ca93d8740d"} Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.131797 4734 scope.go:117] "RemoveContainer" containerID="fe4d8d710b2e9360a934579a2cdd9ae758e44c010f3a43db871e05efcb6663a9" Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.134878 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.149128 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.163450 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 23:39:27 crc kubenswrapper[4734]: E1205 23:39:27.164087 4734 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2f773bf-b6a2-4f90-bf47-f3bd63431381" containerName="cinder-scheduler" Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.164103 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f773bf-b6a2-4f90-bf47-f3bd63431381" containerName="cinder-scheduler" Dec 05 23:39:27 crc kubenswrapper[4734]: E1205 23:39:27.164118 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2f773bf-b6a2-4f90-bf47-f3bd63431381" containerName="probe" Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.164125 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f773bf-b6a2-4f90-bf47-f3bd63431381" containerName="probe" Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.164302 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2f773bf-b6a2-4f90-bf47-f3bd63431381" containerName="cinder-scheduler" Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.164324 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2f773bf-b6a2-4f90-bf47-f3bd63431381" containerName="probe" Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.165563 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.170004 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.174246 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.279257 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d36cae72-5806-4d9c-80a9-c396c5ca00d6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d36cae72-5806-4d9c-80a9-c396c5ca00d6\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.279332 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d36cae72-5806-4d9c-80a9-c396c5ca00d6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d36cae72-5806-4d9c-80a9-c396c5ca00d6\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.279382 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d36cae72-5806-4d9c-80a9-c396c5ca00d6-config-data\") pod \"cinder-scheduler-0\" (UID: \"d36cae72-5806-4d9c-80a9-c396c5ca00d6\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.279403 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36cae72-5806-4d9c-80a9-c396c5ca00d6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d36cae72-5806-4d9c-80a9-c396c5ca00d6\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 
23:39:27.279423 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d36cae72-5806-4d9c-80a9-c396c5ca00d6-scripts\") pod \"cinder-scheduler-0\" (UID: \"d36cae72-5806-4d9c-80a9-c396c5ca00d6\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.279445 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmgtx\" (UniqueName: \"kubernetes.io/projected/d36cae72-5806-4d9c-80a9-c396c5ca00d6-kube-api-access-xmgtx\") pod \"cinder-scheduler-0\" (UID: \"d36cae72-5806-4d9c-80a9-c396c5ca00d6\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.380971 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d36cae72-5806-4d9c-80a9-c396c5ca00d6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d36cae72-5806-4d9c-80a9-c396c5ca00d6\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.381124 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d36cae72-5806-4d9c-80a9-c396c5ca00d6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d36cae72-5806-4d9c-80a9-c396c5ca00d6\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.381573 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d36cae72-5806-4d9c-80a9-c396c5ca00d6-config-data\") pod \"cinder-scheduler-0\" (UID: \"d36cae72-5806-4d9c-80a9-c396c5ca00d6\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.381601 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d36cae72-5806-4d9c-80a9-c396c5ca00d6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d36cae72-5806-4d9c-80a9-c396c5ca00d6\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.381648 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d36cae72-5806-4d9c-80a9-c396c5ca00d6-scripts\") pod \"cinder-scheduler-0\" (UID: \"d36cae72-5806-4d9c-80a9-c396c5ca00d6\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.381665 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmgtx\" (UniqueName: \"kubernetes.io/projected/d36cae72-5806-4d9c-80a9-c396c5ca00d6-kube-api-access-xmgtx\") pod \"cinder-scheduler-0\" (UID: \"d36cae72-5806-4d9c-80a9-c396c5ca00d6\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.382715 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d36cae72-5806-4d9c-80a9-c396c5ca00d6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d36cae72-5806-4d9c-80a9-c396c5ca00d6\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.390589 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d36cae72-5806-4d9c-80a9-c396c5ca00d6-scripts\") pod \"cinder-scheduler-0\" (UID: \"d36cae72-5806-4d9c-80a9-c396c5ca00d6\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.391631 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36cae72-5806-4d9c-80a9-c396c5ca00d6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d36cae72-5806-4d9c-80a9-c396c5ca00d6\") " 
pod="openstack/cinder-scheduler-0" Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.392995 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d36cae72-5806-4d9c-80a9-c396c5ca00d6-config-data\") pod \"cinder-scheduler-0\" (UID: \"d36cae72-5806-4d9c-80a9-c396c5ca00d6\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.406700 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d36cae72-5806-4d9c-80a9-c396c5ca00d6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d36cae72-5806-4d9c-80a9-c396c5ca00d6\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.424002 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmgtx\" (UniqueName: \"kubernetes.io/projected/d36cae72-5806-4d9c-80a9-c396c5ca00d6-kube-api-access-xmgtx\") pod \"cinder-scheduler-0\" (UID: \"d36cae72-5806-4d9c-80a9-c396c5ca00d6\") " pod="openstack/cinder-scheduler-0" Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.606335 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 23:39:27 crc kubenswrapper[4734]: I1205 23:39:27.642198 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2f773bf-b6a2-4f90-bf47-f3bd63431381" path="/var/lib/kubelet/pods/e2f773bf-b6a2-4f90-bf47-f3bd63431381/volumes" Dec 05 23:39:28 crc kubenswrapper[4734]: I1205 23:39:28.218120 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 23:39:28 crc kubenswrapper[4734]: I1205 23:39:28.637492 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5574c9fdf8-q682b" Dec 05 23:39:28 crc kubenswrapper[4734]: I1205 23:39:28.857649 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5574c9fdf8-q682b" Dec 05 23:39:28 crc kubenswrapper[4734]: I1205 23:39:28.948462 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-75b7946cc8-hzcp7"] Dec 05 23:39:28 crc kubenswrapper[4734]: I1205 23:39:28.948754 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-75b7946cc8-hzcp7" podUID="4093fb52-8433-4e30-9a08-9fc77fb5d49e" containerName="barbican-api-log" containerID="cri-o://bd514e4b2369b6ed51bbdc649f68e8ac1b180b1bf9b9530da65841f4c089923a" gracePeriod=30 Dec 05 23:39:28 crc kubenswrapper[4734]: I1205 23:39:28.948902 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-75b7946cc8-hzcp7" podUID="4093fb52-8433-4e30-9a08-9fc77fb5d49e" containerName="barbican-api" containerID="cri-o://130367aed39c766db24d0825b622516d6a6256618ee3738e9e1354f424f5a5a4" gracePeriod=30 Dec 05 23:39:28 crc kubenswrapper[4734]: I1205 23:39:28.982733 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-75b7946cc8-hzcp7" podUID="4093fb52-8433-4e30-9a08-9fc77fb5d49e" containerName="barbican-api-log" probeResult="failure" output="Get 
\"http://10.217.0.161:9311/healthcheck\": EOF" Dec 05 23:39:29 crc kubenswrapper[4734]: I1205 23:39:29.126114 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d36cae72-5806-4d9c-80a9-c396c5ca00d6","Type":"ContainerStarted","Data":"e7b804f5040ae6b53f83ea4883da68c6326bb5a4ccb800c91252acd2ffbab8fd"} Dec 05 23:39:29 crc kubenswrapper[4734]: I1205 23:39:29.143800 4734 generic.go:334] "Generic (PLEG): container finished" podID="4093fb52-8433-4e30-9a08-9fc77fb5d49e" containerID="bd514e4b2369b6ed51bbdc649f68e8ac1b180b1bf9b9530da65841f4c089923a" exitCode=143 Dec 05 23:39:29 crc kubenswrapper[4734]: I1205 23:39:29.143862 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75b7946cc8-hzcp7" event={"ID":"4093fb52-8433-4e30-9a08-9fc77fb5d49e","Type":"ContainerDied","Data":"bd514e4b2369b6ed51bbdc649f68e8ac1b180b1bf9b9530da65841f4c089923a"} Dec 05 23:39:29 crc kubenswrapper[4734]: E1205 23:39:29.662271 4734 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21a9837d_cc1d_4bf1_9b52_f9196880e367.slice/crio-e3a126dc953224f7cf69d43ba28034a6b6151fba211d245d8ef374996a56a5fa.scope\": RecentStats: unable to find data in memory cache]" Dec 05 23:39:30 crc kubenswrapper[4734]: I1205 23:39:30.188232 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d36cae72-5806-4d9c-80a9-c396c5ca00d6","Type":"ContainerStarted","Data":"391d77f9b1cf83977b2181d851ae19d0500934c3499f890ea47b9c6f0a93f0f2"} Dec 05 23:39:31 crc kubenswrapper[4734]: I1205 23:39:31.262308 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d36cae72-5806-4d9c-80a9-c396c5ca00d6","Type":"ContainerStarted","Data":"ed6b93c1c2ca3afc6c5e49be44118d279481e73c1a76c028f31b344b920a64f4"} Dec 05 23:39:31 crc kubenswrapper[4734]: I1205 
23:39:31.297095 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.297072868 podStartE2EDuration="4.297072868s" podCreationTimestamp="2025-12-05 23:39:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:39:31.295246324 +0000 UTC m=+1191.978650600" watchObservedRunningTime="2025-12-05 23:39:31.297072868 +0000 UTC m=+1191.980477144" Dec 05 23:39:31 crc kubenswrapper[4734]: I1205 23:39:31.898706 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5d469948dd-n7t4x" podUID="c96cd173-4707-4edc-a92e-35db297082e2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Dec 05 23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.105152 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-66674dc5bc-l642k"] Dec 05 23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.108360 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-66674dc5bc-l642k" Dec 05 23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.111202 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 05 23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.111606 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 05 23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.111645 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 05 23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.128437 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-66674dc5bc-l642k"] Dec 05 23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.175411 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d955842c-e3a2-4a05-a380-78c6f2fbdf3b-run-httpd\") pod \"swift-proxy-66674dc5bc-l642k\" (UID: \"d955842c-e3a2-4a05-a380-78c6f2fbdf3b\") " pod="openstack/swift-proxy-66674dc5bc-l642k" Dec 05 23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.175542 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d955842c-e3a2-4a05-a380-78c6f2fbdf3b-config-data\") pod \"swift-proxy-66674dc5bc-l642k\" (UID: \"d955842c-e3a2-4a05-a380-78c6f2fbdf3b\") " pod="openstack/swift-proxy-66674dc5bc-l642k" Dec 05 23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.175582 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d955842c-e3a2-4a05-a380-78c6f2fbdf3b-etc-swift\") pod \"swift-proxy-66674dc5bc-l642k\" (UID: \"d955842c-e3a2-4a05-a380-78c6f2fbdf3b\") " pod="openstack/swift-proxy-66674dc5bc-l642k" Dec 05 23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.175841 4734 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d955842c-e3a2-4a05-a380-78c6f2fbdf3b-internal-tls-certs\") pod \"swift-proxy-66674dc5bc-l642k\" (UID: \"d955842c-e3a2-4a05-a380-78c6f2fbdf3b\") " pod="openstack/swift-proxy-66674dc5bc-l642k" Dec 05 23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.175920 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d955842c-e3a2-4a05-a380-78c6f2fbdf3b-combined-ca-bundle\") pod \"swift-proxy-66674dc5bc-l642k\" (UID: \"d955842c-e3a2-4a05-a380-78c6f2fbdf3b\") " pod="openstack/swift-proxy-66674dc5bc-l642k" Dec 05 23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.175951 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d955842c-e3a2-4a05-a380-78c6f2fbdf3b-log-httpd\") pod \"swift-proxy-66674dc5bc-l642k\" (UID: \"d955842c-e3a2-4a05-a380-78c6f2fbdf3b\") " pod="openstack/swift-proxy-66674dc5bc-l642k" Dec 05 23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.175989 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d955842c-e3a2-4a05-a380-78c6f2fbdf3b-public-tls-certs\") pod \"swift-proxy-66674dc5bc-l642k\" (UID: \"d955842c-e3a2-4a05-a380-78c6f2fbdf3b\") " pod="openstack/swift-proxy-66674dc5bc-l642k" Dec 05 23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.176095 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwgmb\" (UniqueName: \"kubernetes.io/projected/d955842c-e3a2-4a05-a380-78c6f2fbdf3b-kube-api-access-hwgmb\") pod \"swift-proxy-66674dc5bc-l642k\" (UID: \"d955842c-e3a2-4a05-a380-78c6f2fbdf3b\") " pod="openstack/swift-proxy-66674dc5bc-l642k" Dec 05 
23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.279005 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d955842c-e3a2-4a05-a380-78c6f2fbdf3b-config-data\") pod \"swift-proxy-66674dc5bc-l642k\" (UID: \"d955842c-e3a2-4a05-a380-78c6f2fbdf3b\") " pod="openstack/swift-proxy-66674dc5bc-l642k" Dec 05 23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.279094 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d955842c-e3a2-4a05-a380-78c6f2fbdf3b-etc-swift\") pod \"swift-proxy-66674dc5bc-l642k\" (UID: \"d955842c-e3a2-4a05-a380-78c6f2fbdf3b\") " pod="openstack/swift-proxy-66674dc5bc-l642k" Dec 05 23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.279170 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d955842c-e3a2-4a05-a380-78c6f2fbdf3b-internal-tls-certs\") pod \"swift-proxy-66674dc5bc-l642k\" (UID: \"d955842c-e3a2-4a05-a380-78c6f2fbdf3b\") " pod="openstack/swift-proxy-66674dc5bc-l642k" Dec 05 23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.279197 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d955842c-e3a2-4a05-a380-78c6f2fbdf3b-combined-ca-bundle\") pod \"swift-proxy-66674dc5bc-l642k\" (UID: \"d955842c-e3a2-4a05-a380-78c6f2fbdf3b\") " pod="openstack/swift-proxy-66674dc5bc-l642k" Dec 05 23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.279219 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d955842c-e3a2-4a05-a380-78c6f2fbdf3b-log-httpd\") pod \"swift-proxy-66674dc5bc-l642k\" (UID: \"d955842c-e3a2-4a05-a380-78c6f2fbdf3b\") " pod="openstack/swift-proxy-66674dc5bc-l642k" Dec 05 23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.279247 4734 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d955842c-e3a2-4a05-a380-78c6f2fbdf3b-public-tls-certs\") pod \"swift-proxy-66674dc5bc-l642k\" (UID: \"d955842c-e3a2-4a05-a380-78c6f2fbdf3b\") " pod="openstack/swift-proxy-66674dc5bc-l642k" Dec 05 23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.279301 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwgmb\" (UniqueName: \"kubernetes.io/projected/d955842c-e3a2-4a05-a380-78c6f2fbdf3b-kube-api-access-hwgmb\") pod \"swift-proxy-66674dc5bc-l642k\" (UID: \"d955842c-e3a2-4a05-a380-78c6f2fbdf3b\") " pod="openstack/swift-proxy-66674dc5bc-l642k" Dec 05 23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.279365 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d955842c-e3a2-4a05-a380-78c6f2fbdf3b-run-httpd\") pod \"swift-proxy-66674dc5bc-l642k\" (UID: \"d955842c-e3a2-4a05-a380-78c6f2fbdf3b\") " pod="openstack/swift-proxy-66674dc5bc-l642k" Dec 05 23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.280119 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d955842c-e3a2-4a05-a380-78c6f2fbdf3b-run-httpd\") pod \"swift-proxy-66674dc5bc-l642k\" (UID: \"d955842c-e3a2-4a05-a380-78c6f2fbdf3b\") " pod="openstack/swift-proxy-66674dc5bc-l642k" Dec 05 23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.280435 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d955842c-e3a2-4a05-a380-78c6f2fbdf3b-log-httpd\") pod \"swift-proxy-66674dc5bc-l642k\" (UID: \"d955842c-e3a2-4a05-a380-78c6f2fbdf3b\") " pod="openstack/swift-proxy-66674dc5bc-l642k" Dec 05 23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.288657 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d955842c-e3a2-4a05-a380-78c6f2fbdf3b-internal-tls-certs\") pod \"swift-proxy-66674dc5bc-l642k\" (UID: \"d955842c-e3a2-4a05-a380-78c6f2fbdf3b\") " pod="openstack/swift-proxy-66674dc5bc-l642k" Dec 05 23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.289158 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d955842c-e3a2-4a05-a380-78c6f2fbdf3b-public-tls-certs\") pod \"swift-proxy-66674dc5bc-l642k\" (UID: \"d955842c-e3a2-4a05-a380-78c6f2fbdf3b\") " pod="openstack/swift-proxy-66674dc5bc-l642k" Dec 05 23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.291263 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d955842c-e3a2-4a05-a380-78c6f2fbdf3b-etc-swift\") pod \"swift-proxy-66674dc5bc-l642k\" (UID: \"d955842c-e3a2-4a05-a380-78c6f2fbdf3b\") " pod="openstack/swift-proxy-66674dc5bc-l642k" Dec 05 23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.291277 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d955842c-e3a2-4a05-a380-78c6f2fbdf3b-config-data\") pod \"swift-proxy-66674dc5bc-l642k\" (UID: \"d955842c-e3a2-4a05-a380-78c6f2fbdf3b\") " pod="openstack/swift-proxy-66674dc5bc-l642k" Dec 05 23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.309995 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d955842c-e3a2-4a05-a380-78c6f2fbdf3b-combined-ca-bundle\") pod \"swift-proxy-66674dc5bc-l642k\" (UID: \"d955842c-e3a2-4a05-a380-78c6f2fbdf3b\") " pod="openstack/swift-proxy-66674dc5bc-l642k" Dec 05 23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.321180 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwgmb\" (UniqueName: 
\"kubernetes.io/projected/d955842c-e3a2-4a05-a380-78c6f2fbdf3b-kube-api-access-hwgmb\") pod \"swift-proxy-66674dc5bc-l642k\" (UID: \"d955842c-e3a2-4a05-a380-78c6f2fbdf3b\") " pod="openstack/swift-proxy-66674dc5bc-l642k" Dec 05 23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.433921 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-66674dc5bc-l642k" Dec 05 23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.613295 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 05 23:39:32 crc kubenswrapper[4734]: I1205 23:39:32.913881 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-75b7946cc8-hzcp7" Dec 05 23:39:33 crc kubenswrapper[4734]: I1205 23:39:33.004495 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfsth\" (UniqueName: \"kubernetes.io/projected/4093fb52-8433-4e30-9a08-9fc77fb5d49e-kube-api-access-kfsth\") pod \"4093fb52-8433-4e30-9a08-9fc77fb5d49e\" (UID: \"4093fb52-8433-4e30-9a08-9fc77fb5d49e\") " Dec 05 23:39:33 crc kubenswrapper[4734]: I1205 23:39:33.005158 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4093fb52-8433-4e30-9a08-9fc77fb5d49e-logs\") pod \"4093fb52-8433-4e30-9a08-9fc77fb5d49e\" (UID: \"4093fb52-8433-4e30-9a08-9fc77fb5d49e\") " Dec 05 23:39:33 crc kubenswrapper[4734]: I1205 23:39:33.005254 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4093fb52-8433-4e30-9a08-9fc77fb5d49e-config-data-custom\") pod \"4093fb52-8433-4e30-9a08-9fc77fb5d49e\" (UID: \"4093fb52-8433-4e30-9a08-9fc77fb5d49e\") " Dec 05 23:39:33 crc kubenswrapper[4734]: I1205 23:39:33.005297 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/4093fb52-8433-4e30-9a08-9fc77fb5d49e-combined-ca-bundle\") pod \"4093fb52-8433-4e30-9a08-9fc77fb5d49e\" (UID: \"4093fb52-8433-4e30-9a08-9fc77fb5d49e\") " Dec 05 23:39:33 crc kubenswrapper[4734]: I1205 23:39:33.005320 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4093fb52-8433-4e30-9a08-9fc77fb5d49e-config-data\") pod \"4093fb52-8433-4e30-9a08-9fc77fb5d49e\" (UID: \"4093fb52-8433-4e30-9a08-9fc77fb5d49e\") " Dec 05 23:39:33 crc kubenswrapper[4734]: I1205 23:39:33.006345 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4093fb52-8433-4e30-9a08-9fc77fb5d49e-logs" (OuterVolumeSpecName: "logs") pod "4093fb52-8433-4e30-9a08-9fc77fb5d49e" (UID: "4093fb52-8433-4e30-9a08-9fc77fb5d49e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:39:33 crc kubenswrapper[4734]: I1205 23:39:33.023343 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4093fb52-8433-4e30-9a08-9fc77fb5d49e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4093fb52-8433-4e30-9a08-9fc77fb5d49e" (UID: "4093fb52-8433-4e30-9a08-9fc77fb5d49e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:33 crc kubenswrapper[4734]: I1205 23:39:33.028795 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4093fb52-8433-4e30-9a08-9fc77fb5d49e-kube-api-access-kfsth" (OuterVolumeSpecName: "kube-api-access-kfsth") pod "4093fb52-8433-4e30-9a08-9fc77fb5d49e" (UID: "4093fb52-8433-4e30-9a08-9fc77fb5d49e"). InnerVolumeSpecName "kube-api-access-kfsth". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:39:33 crc kubenswrapper[4734]: I1205 23:39:33.044728 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4093fb52-8433-4e30-9a08-9fc77fb5d49e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4093fb52-8433-4e30-9a08-9fc77fb5d49e" (UID: "4093fb52-8433-4e30-9a08-9fc77fb5d49e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:33 crc kubenswrapper[4734]: I1205 23:39:33.079716 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4093fb52-8433-4e30-9a08-9fc77fb5d49e-config-data" (OuterVolumeSpecName: "config-data") pod "4093fb52-8433-4e30-9a08-9fc77fb5d49e" (UID: "4093fb52-8433-4e30-9a08-9fc77fb5d49e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:33 crc kubenswrapper[4734]: I1205 23:39:33.108170 4734 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4093fb52-8433-4e30-9a08-9fc77fb5d49e-logs\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:33 crc kubenswrapper[4734]: I1205 23:39:33.108230 4734 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4093fb52-8433-4e30-9a08-9fc77fb5d49e-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:33 crc kubenswrapper[4734]: I1205 23:39:33.108244 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4093fb52-8433-4e30-9a08-9fc77fb5d49e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:33 crc kubenswrapper[4734]: I1205 23:39:33.108257 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4093fb52-8433-4e30-9a08-9fc77fb5d49e-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:33 
crc kubenswrapper[4734]: I1205 23:39:33.108268 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfsth\" (UniqueName: \"kubernetes.io/projected/4093fb52-8433-4e30-9a08-9fc77fb5d49e-kube-api-access-kfsth\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:33 crc kubenswrapper[4734]: I1205 23:39:33.177154 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-66674dc5bc-l642k"] Dec 05 23:39:33 crc kubenswrapper[4734]: W1205 23:39:33.182617 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd955842c_e3a2_4a05_a380_78c6f2fbdf3b.slice/crio-1c9f61dac52bb715f2ee683302a0874f51b11a536098dc3cb210e5cf012069af WatchSource:0}: Error finding container 1c9f61dac52bb715f2ee683302a0874f51b11a536098dc3cb210e5cf012069af: Status 404 returned error can't find the container with id 1c9f61dac52bb715f2ee683302a0874f51b11a536098dc3cb210e5cf012069af Dec 05 23:39:33 crc kubenswrapper[4734]: I1205 23:39:33.318090 4734 generic.go:334] "Generic (PLEG): container finished" podID="4093fb52-8433-4e30-9a08-9fc77fb5d49e" containerID="130367aed39c766db24d0825b622516d6a6256618ee3738e9e1354f424f5a5a4" exitCode=0 Dec 05 23:39:33 crc kubenswrapper[4734]: I1205 23:39:33.318222 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-75b7946cc8-hzcp7" Dec 05 23:39:33 crc kubenswrapper[4734]: I1205 23:39:33.318258 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75b7946cc8-hzcp7" event={"ID":"4093fb52-8433-4e30-9a08-9fc77fb5d49e","Type":"ContainerDied","Data":"130367aed39c766db24d0825b622516d6a6256618ee3738e9e1354f424f5a5a4"} Dec 05 23:39:33 crc kubenswrapper[4734]: I1205 23:39:33.319316 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75b7946cc8-hzcp7" event={"ID":"4093fb52-8433-4e30-9a08-9fc77fb5d49e","Type":"ContainerDied","Data":"4fd3041ce1d84d198169382e76509c859b64138831cc3374cc9dd8d9be045393"} Dec 05 23:39:33 crc kubenswrapper[4734]: I1205 23:39:33.319344 4734 scope.go:117] "RemoveContainer" containerID="130367aed39c766db24d0825b622516d6a6256618ee3738e9e1354f424f5a5a4" Dec 05 23:39:33 crc kubenswrapper[4734]: I1205 23:39:33.334713 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-66674dc5bc-l642k" event={"ID":"d955842c-e3a2-4a05-a380-78c6f2fbdf3b","Type":"ContainerStarted","Data":"1c9f61dac52bb715f2ee683302a0874f51b11a536098dc3cb210e5cf012069af"} Dec 05 23:39:33 crc kubenswrapper[4734]: I1205 23:39:33.363203 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-75b7946cc8-hzcp7"] Dec 05 23:39:33 crc kubenswrapper[4734]: I1205 23:39:33.373497 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-75b7946cc8-hzcp7"] Dec 05 23:39:33 crc kubenswrapper[4734]: I1205 23:39:33.382138 4734 scope.go:117] "RemoveContainer" containerID="bd514e4b2369b6ed51bbdc649f68e8ac1b180b1bf9b9530da65841f4c089923a" Dec 05 23:39:33 crc kubenswrapper[4734]: I1205 23:39:33.417051 4734 scope.go:117] "RemoveContainer" containerID="130367aed39c766db24d0825b622516d6a6256618ee3738e9e1354f424f5a5a4" Dec 05 23:39:33 crc kubenswrapper[4734]: E1205 23:39:33.418002 4734 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"130367aed39c766db24d0825b622516d6a6256618ee3738e9e1354f424f5a5a4\": container with ID starting with 130367aed39c766db24d0825b622516d6a6256618ee3738e9e1354f424f5a5a4 not found: ID does not exist" containerID="130367aed39c766db24d0825b622516d6a6256618ee3738e9e1354f424f5a5a4" Dec 05 23:39:33 crc kubenswrapper[4734]: I1205 23:39:33.418059 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"130367aed39c766db24d0825b622516d6a6256618ee3738e9e1354f424f5a5a4"} err="failed to get container status \"130367aed39c766db24d0825b622516d6a6256618ee3738e9e1354f424f5a5a4\": rpc error: code = NotFound desc = could not find container \"130367aed39c766db24d0825b622516d6a6256618ee3738e9e1354f424f5a5a4\": container with ID starting with 130367aed39c766db24d0825b622516d6a6256618ee3738e9e1354f424f5a5a4 not found: ID does not exist" Dec 05 23:39:33 crc kubenswrapper[4734]: I1205 23:39:33.418083 4734 scope.go:117] "RemoveContainer" containerID="bd514e4b2369b6ed51bbdc649f68e8ac1b180b1bf9b9530da65841f4c089923a" Dec 05 23:39:33 crc kubenswrapper[4734]: E1205 23:39:33.420038 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd514e4b2369b6ed51bbdc649f68e8ac1b180b1bf9b9530da65841f4c089923a\": container with ID starting with bd514e4b2369b6ed51bbdc649f68e8ac1b180b1bf9b9530da65841f4c089923a not found: ID does not exist" containerID="bd514e4b2369b6ed51bbdc649f68e8ac1b180b1bf9b9530da65841f4c089923a" Dec 05 23:39:33 crc kubenswrapper[4734]: I1205 23:39:33.420061 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd514e4b2369b6ed51bbdc649f68e8ac1b180b1bf9b9530da65841f4c089923a"} err="failed to get container status \"bd514e4b2369b6ed51bbdc649f68e8ac1b180b1bf9b9530da65841f4c089923a\": rpc error: code = NotFound desc = could not find container 
\"bd514e4b2369b6ed51bbdc649f68e8ac1b180b1bf9b9530da65841f4c089923a\": container with ID starting with bd514e4b2369b6ed51bbdc649f68e8ac1b180b1bf9b9530da65841f4c089923a not found: ID does not exist" Dec 05 23:39:33 crc kubenswrapper[4734]: I1205 23:39:33.630880 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4093fb52-8433-4e30-9a08-9fc77fb5d49e" path="/var/lib/kubelet/pods/4093fb52-8433-4e30-9a08-9fc77fb5d49e/volumes" Dec 05 23:39:33 crc kubenswrapper[4734]: I1205 23:39:33.660043 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 05 23:39:34 crc kubenswrapper[4734]: I1205 23:39:34.359010 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-66674dc5bc-l642k" event={"ID":"d955842c-e3a2-4a05-a380-78c6f2fbdf3b","Type":"ContainerStarted","Data":"38db8781f4d366e092bfefd6ba153e2530f4e1a39aa16e11f7bc45032ee7cef3"} Dec 05 23:39:34 crc kubenswrapper[4734]: I1205 23:39:34.359487 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-66674dc5bc-l642k" event={"ID":"d955842c-e3a2-4a05-a380-78c6f2fbdf3b","Type":"ContainerStarted","Data":"30469553bdad9a34338b660fe2672d95f9ddb2faf71955dd3ec572dc0f979637"} Dec 05 23:39:35 crc kubenswrapper[4734]: I1205 23:39:35.373342 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-66674dc5bc-l642k" Dec 05 23:39:36 crc kubenswrapper[4734]: I1205 23:39:36.389198 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-66674dc5bc-l642k" Dec 05 23:39:37 crc kubenswrapper[4734]: I1205 23:39:37.183377 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-66674dc5bc-l642k" podStartSLOduration=5.183343438 podStartE2EDuration="5.183343438s" podCreationTimestamp="2025-12-05 23:39:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-12-05 23:39:35.411771688 +0000 UTC m=+1196.095175964" watchObservedRunningTime="2025-12-05 23:39:37.183343438 +0000 UTC m=+1197.866747714" Dec 05 23:39:37 crc kubenswrapper[4734]: I1205 23:39:37.184925 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 23:39:37 crc kubenswrapper[4734]: I1205 23:39:37.185435 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe03fe87-03d0-45aa-a054-5e991c765ccc" containerName="ceilometer-central-agent" containerID="cri-o://eb6ec676712aee3b9d54d8188fc659aaed58c25d07c5471140bac8f9492272a9" gracePeriod=30 Dec 05 23:39:37 crc kubenswrapper[4734]: I1205 23:39:37.186216 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe03fe87-03d0-45aa-a054-5e991c765ccc" containerName="sg-core" containerID="cri-o://b531e588ce7410be24b22419fc98c0ca55459c394dd280ee194826305e09d19c" gracePeriod=30 Dec 05 23:39:37 crc kubenswrapper[4734]: I1205 23:39:37.186310 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe03fe87-03d0-45aa-a054-5e991c765ccc" containerName="ceilometer-notification-agent" containerID="cri-o://bb470f46520236c5eab4dd981855f588aac943423952865089e1166f6e31a918" gracePeriod=30 Dec 05 23:39:37 crc kubenswrapper[4734]: I1205 23:39:37.186381 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe03fe87-03d0-45aa-a054-5e991c765ccc" containerName="proxy-httpd" containerID="cri-o://8eb3419c100e18d602d17e80460f5aa327255183e72309669de08e8b7228eb7b" gracePeriod=30 Dec 05 23:39:37 crc kubenswrapper[4734]: I1205 23:39:37.201844 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 05 23:39:37 crc kubenswrapper[4734]: I1205 23:39:37.406177 4734 generic.go:334] "Generic (PLEG): container 
finished" podID="fe03fe87-03d0-45aa-a054-5e991c765ccc" containerID="b531e588ce7410be24b22419fc98c0ca55459c394dd280ee194826305e09d19c" exitCode=2 Dec 05 23:39:37 crc kubenswrapper[4734]: I1205 23:39:37.406276 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe03fe87-03d0-45aa-a054-5e991c765ccc","Type":"ContainerDied","Data":"b531e588ce7410be24b22419fc98c0ca55459c394dd280ee194826305e09d19c"} Dec 05 23:39:38 crc kubenswrapper[4734]: I1205 23:39:38.106463 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 05 23:39:38 crc kubenswrapper[4734]: I1205 23:39:38.423432 4734 generic.go:334] "Generic (PLEG): container finished" podID="fe03fe87-03d0-45aa-a054-5e991c765ccc" containerID="8eb3419c100e18d602d17e80460f5aa327255183e72309669de08e8b7228eb7b" exitCode=0 Dec 05 23:39:38 crc kubenswrapper[4734]: I1205 23:39:38.423908 4734 generic.go:334] "Generic (PLEG): container finished" podID="fe03fe87-03d0-45aa-a054-5e991c765ccc" containerID="eb6ec676712aee3b9d54d8188fc659aaed58c25d07c5471140bac8f9492272a9" exitCode=0 Dec 05 23:39:38 crc kubenswrapper[4734]: I1205 23:39:38.423940 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe03fe87-03d0-45aa-a054-5e991c765ccc","Type":"ContainerDied","Data":"8eb3419c100e18d602d17e80460f5aa327255183e72309669de08e8b7228eb7b"} Dec 05 23:39:38 crc kubenswrapper[4734]: I1205 23:39:38.423975 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe03fe87-03d0-45aa-a054-5e991c765ccc","Type":"ContainerDied","Data":"eb6ec676712aee3b9d54d8188fc659aaed58c25d07c5471140bac8f9492272a9"} Dec 05 23:39:38 crc kubenswrapper[4734]: I1205 23:39:38.517576 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="fe03fe87-03d0-45aa-a054-5e991c765ccc" containerName="proxy-httpd" probeResult="failure" output="Get 
\"http://10.217.0.157:3000/\": dial tcp 10.217.0.157:3000: connect: connection refused" Dec 05 23:39:40 crc kubenswrapper[4734]: I1205 23:39:40.451086 4734 generic.go:334] "Generic (PLEG): container finished" podID="fe03fe87-03d0-45aa-a054-5e991c765ccc" containerID="bb470f46520236c5eab4dd981855f588aac943423952865089e1166f6e31a918" exitCode=0 Dec 05 23:39:40 crc kubenswrapper[4734]: I1205 23:39:40.451185 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe03fe87-03d0-45aa-a054-5e991c765ccc","Type":"ContainerDied","Data":"bb470f46520236c5eab4dd981855f588aac943423952865089e1166f6e31a918"} Dec 05 23:39:41 crc kubenswrapper[4734]: I1205 23:39:41.900888 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5d469948dd-n7t4x" podUID="c96cd173-4707-4edc-a92e-35db297082e2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Dec 05 23:39:41 crc kubenswrapper[4734]: I1205 23:39:41.901076 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5d469948dd-n7t4x" Dec 05 23:39:42 crc kubenswrapper[4734]: I1205 23:39:42.445198 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-66674dc5bc-l642k" Dec 05 23:39:42 crc kubenswrapper[4734]: I1205 23:39:42.451911 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-66674dc5bc-l642k" Dec 05 23:39:43 crc kubenswrapper[4734]: I1205 23:39:43.651048 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 23:39:43 crc kubenswrapper[4734]: I1205 23:39:43.781480 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe03fe87-03d0-45aa-a054-5e991c765ccc-scripts\") pod \"fe03fe87-03d0-45aa-a054-5e991c765ccc\" (UID: \"fe03fe87-03d0-45aa-a054-5e991c765ccc\") " Dec 05 23:39:43 crc kubenswrapper[4734]: I1205 23:39:43.781585 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe03fe87-03d0-45aa-a054-5e991c765ccc-sg-core-conf-yaml\") pod \"fe03fe87-03d0-45aa-a054-5e991c765ccc\" (UID: \"fe03fe87-03d0-45aa-a054-5e991c765ccc\") " Dec 05 23:39:43 crc kubenswrapper[4734]: I1205 23:39:43.781614 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe03fe87-03d0-45aa-a054-5e991c765ccc-combined-ca-bundle\") pod \"fe03fe87-03d0-45aa-a054-5e991c765ccc\" (UID: \"fe03fe87-03d0-45aa-a054-5e991c765ccc\") " Dec 05 23:39:43 crc kubenswrapper[4734]: I1205 23:39:43.781664 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rq4t4\" (UniqueName: \"kubernetes.io/projected/fe03fe87-03d0-45aa-a054-5e991c765ccc-kube-api-access-rq4t4\") pod \"fe03fe87-03d0-45aa-a054-5e991c765ccc\" (UID: \"fe03fe87-03d0-45aa-a054-5e991c765ccc\") " Dec 05 23:39:43 crc kubenswrapper[4734]: I1205 23:39:43.781702 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe03fe87-03d0-45aa-a054-5e991c765ccc-config-data\") pod \"fe03fe87-03d0-45aa-a054-5e991c765ccc\" (UID: \"fe03fe87-03d0-45aa-a054-5e991c765ccc\") " Dec 05 23:39:43 crc kubenswrapper[4734]: I1205 23:39:43.781756 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/fe03fe87-03d0-45aa-a054-5e991c765ccc-run-httpd\") pod \"fe03fe87-03d0-45aa-a054-5e991c765ccc\" (UID: \"fe03fe87-03d0-45aa-a054-5e991c765ccc\") " Dec 05 23:39:43 crc kubenswrapper[4734]: I1205 23:39:43.781988 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe03fe87-03d0-45aa-a054-5e991c765ccc-log-httpd\") pod \"fe03fe87-03d0-45aa-a054-5e991c765ccc\" (UID: \"fe03fe87-03d0-45aa-a054-5e991c765ccc\") " Dec 05 23:39:43 crc kubenswrapper[4734]: I1205 23:39:43.782252 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe03fe87-03d0-45aa-a054-5e991c765ccc-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fe03fe87-03d0-45aa-a054-5e991c765ccc" (UID: "fe03fe87-03d0-45aa-a054-5e991c765ccc"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:39:43 crc kubenswrapper[4734]: I1205 23:39:43.782518 4734 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe03fe87-03d0-45aa-a054-5e991c765ccc-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:43 crc kubenswrapper[4734]: I1205 23:39:43.782704 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe03fe87-03d0-45aa-a054-5e991c765ccc-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fe03fe87-03d0-45aa-a054-5e991c765ccc" (UID: "fe03fe87-03d0-45aa-a054-5e991c765ccc"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:39:43 crc kubenswrapper[4734]: I1205 23:39:43.787390 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe03fe87-03d0-45aa-a054-5e991c765ccc-scripts" (OuterVolumeSpecName: "scripts") pod "fe03fe87-03d0-45aa-a054-5e991c765ccc" (UID: "fe03fe87-03d0-45aa-a054-5e991c765ccc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:43 crc kubenswrapper[4734]: I1205 23:39:43.799551 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe03fe87-03d0-45aa-a054-5e991c765ccc-kube-api-access-rq4t4" (OuterVolumeSpecName: "kube-api-access-rq4t4") pod "fe03fe87-03d0-45aa-a054-5e991c765ccc" (UID: "fe03fe87-03d0-45aa-a054-5e991c765ccc"). InnerVolumeSpecName "kube-api-access-rq4t4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:39:43 crc kubenswrapper[4734]: I1205 23:39:43.829820 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe03fe87-03d0-45aa-a054-5e991c765ccc-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fe03fe87-03d0-45aa-a054-5e991c765ccc" (UID: "fe03fe87-03d0-45aa-a054-5e991c765ccc"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:43 crc kubenswrapper[4734]: I1205 23:39:43.876479 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe03fe87-03d0-45aa-a054-5e991c765ccc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe03fe87-03d0-45aa-a054-5e991c765ccc" (UID: "fe03fe87-03d0-45aa-a054-5e991c765ccc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:43 crc kubenswrapper[4734]: I1205 23:39:43.884699 4734 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe03fe87-03d0-45aa-a054-5e991c765ccc-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:43 crc kubenswrapper[4734]: I1205 23:39:43.884762 4734 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe03fe87-03d0-45aa-a054-5e991c765ccc-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:43 crc kubenswrapper[4734]: I1205 23:39:43.884777 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe03fe87-03d0-45aa-a054-5e991c765ccc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:43 crc kubenswrapper[4734]: I1205 23:39:43.884788 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rq4t4\" (UniqueName: \"kubernetes.io/projected/fe03fe87-03d0-45aa-a054-5e991c765ccc-kube-api-access-rq4t4\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:43 crc kubenswrapper[4734]: I1205 23:39:43.884798 4734 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe03fe87-03d0-45aa-a054-5e991c765ccc-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:43 crc kubenswrapper[4734]: I1205 23:39:43.927816 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe03fe87-03d0-45aa-a054-5e991c765ccc-config-data" (OuterVolumeSpecName: "config-data") pod "fe03fe87-03d0-45aa-a054-5e991c765ccc" (UID: "fe03fe87-03d0-45aa-a054-5e991c765ccc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:43 crc kubenswrapper[4734]: I1205 23:39:43.986998 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe03fe87-03d0-45aa-a054-5e991c765ccc-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.495798 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe03fe87-03d0-45aa-a054-5e991c765ccc","Type":"ContainerDied","Data":"359b329f911140597764294baedc06b72d5d3a63e4ea959d255a3b1634106471"} Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.496282 4734 scope.go:117] "RemoveContainer" containerID="8eb3419c100e18d602d17e80460f5aa327255183e72309669de08e8b7228eb7b" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.495847 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.498233 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"1538ece1-e24d-4f20-b92d-0b526d1f5698","Type":"ContainerStarted","Data":"8e119feb1426ccd8899b7697d51cbc53625c69323a8d7a43d9da14bc0d6c6507"} Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.519303 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.580184368 podStartE2EDuration="19.519281157s" podCreationTimestamp="2025-12-05 23:39:25 +0000 UTC" firstStartedPulling="2025-12-05 23:39:26.393085271 +0000 UTC m=+1187.076489547" lastFinishedPulling="2025-12-05 23:39:43.33218206 +0000 UTC m=+1204.015586336" observedRunningTime="2025-12-05 23:39:44.517227897 +0000 UTC m=+1205.200632173" watchObservedRunningTime="2025-12-05 23:39:44.519281157 +0000 UTC m=+1205.202685433" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.526320 4734 scope.go:117] "RemoveContainer" 
containerID="b531e588ce7410be24b22419fc98c0ca55459c394dd280ee194826305e09d19c" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.549217 4734 scope.go:117] "RemoveContainer" containerID="bb470f46520236c5eab4dd981855f588aac943423952865089e1166f6e31a918" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.557178 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.572883 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.573852 4734 scope.go:117] "RemoveContainer" containerID="eb6ec676712aee3b9d54d8188fc659aaed58c25d07c5471140bac8f9492272a9" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.601435 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 23:39:44 crc kubenswrapper[4734]: E1205 23:39:44.601933 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4093fb52-8433-4e30-9a08-9fc77fb5d49e" containerName="barbican-api-log" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.601953 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="4093fb52-8433-4e30-9a08-9fc77fb5d49e" containerName="barbican-api-log" Dec 05 23:39:44 crc kubenswrapper[4734]: E1205 23:39:44.601981 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe03fe87-03d0-45aa-a054-5e991c765ccc" containerName="proxy-httpd" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.601988 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe03fe87-03d0-45aa-a054-5e991c765ccc" containerName="proxy-httpd" Dec 05 23:39:44 crc kubenswrapper[4734]: E1205 23:39:44.602006 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4093fb52-8433-4e30-9a08-9fc77fb5d49e" containerName="barbican-api" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.602013 4734 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4093fb52-8433-4e30-9a08-9fc77fb5d49e" containerName="barbican-api" Dec 05 23:39:44 crc kubenswrapper[4734]: E1205 23:39:44.602029 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe03fe87-03d0-45aa-a054-5e991c765ccc" containerName="sg-core" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.602035 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe03fe87-03d0-45aa-a054-5e991c765ccc" containerName="sg-core" Dec 05 23:39:44 crc kubenswrapper[4734]: E1205 23:39:44.602047 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe03fe87-03d0-45aa-a054-5e991c765ccc" containerName="ceilometer-notification-agent" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.602054 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe03fe87-03d0-45aa-a054-5e991c765ccc" containerName="ceilometer-notification-agent" Dec 05 23:39:44 crc kubenswrapper[4734]: E1205 23:39:44.602072 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe03fe87-03d0-45aa-a054-5e991c765ccc" containerName="ceilometer-central-agent" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.602078 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe03fe87-03d0-45aa-a054-5e991c765ccc" containerName="ceilometer-central-agent" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.602248 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe03fe87-03d0-45aa-a054-5e991c765ccc" containerName="proxy-httpd" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.602269 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe03fe87-03d0-45aa-a054-5e991c765ccc" containerName="ceilometer-notification-agent" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.602280 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe03fe87-03d0-45aa-a054-5e991c765ccc" containerName="sg-core" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.602300 4734 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="fe03fe87-03d0-45aa-a054-5e991c765ccc" containerName="ceilometer-central-agent" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.602313 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="4093fb52-8433-4e30-9a08-9fc77fb5d49e" containerName="barbican-api" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.602326 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="4093fb52-8433-4e30-9a08-9fc77fb5d49e" containerName="barbican-api-log" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.608773 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.613376 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.613666 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.617507 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.711959 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8e743d2-607c-4443-ac08-bb87e44e6f00-run-httpd\") pod \"ceilometer-0\" (UID: \"a8e743d2-607c-4443-ac08-bb87e44e6f00\") " pod="openstack/ceilometer-0" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.712085 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8e743d2-607c-4443-ac08-bb87e44e6f00-log-httpd\") pod \"ceilometer-0\" (UID: \"a8e743d2-607c-4443-ac08-bb87e44e6f00\") " pod="openstack/ceilometer-0" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.712151 4734 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8e743d2-607c-4443-ac08-bb87e44e6f00-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a8e743d2-607c-4443-ac08-bb87e44e6f00\") " pod="openstack/ceilometer-0" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.712208 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8e743d2-607c-4443-ac08-bb87e44e6f00-scripts\") pod \"ceilometer-0\" (UID: \"a8e743d2-607c-4443-ac08-bb87e44e6f00\") " pod="openstack/ceilometer-0" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.712250 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll5bj\" (UniqueName: \"kubernetes.io/projected/a8e743d2-607c-4443-ac08-bb87e44e6f00-kube-api-access-ll5bj\") pod \"ceilometer-0\" (UID: \"a8e743d2-607c-4443-ac08-bb87e44e6f00\") " pod="openstack/ceilometer-0" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.712309 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8e743d2-607c-4443-ac08-bb87e44e6f00-config-data\") pod \"ceilometer-0\" (UID: \"a8e743d2-607c-4443-ac08-bb87e44e6f00\") " pod="openstack/ceilometer-0" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.712344 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a8e743d2-607c-4443-ac08-bb87e44e6f00-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a8e743d2-607c-4443-ac08-bb87e44e6f00\") " pod="openstack/ceilometer-0" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.815232 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a8e743d2-607c-4443-ac08-bb87e44e6f00-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a8e743d2-607c-4443-ac08-bb87e44e6f00\") " pod="openstack/ceilometer-0" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.815384 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8e743d2-607c-4443-ac08-bb87e44e6f00-scripts\") pod \"ceilometer-0\" (UID: \"a8e743d2-607c-4443-ac08-bb87e44e6f00\") " pod="openstack/ceilometer-0" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.815447 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll5bj\" (UniqueName: \"kubernetes.io/projected/a8e743d2-607c-4443-ac08-bb87e44e6f00-kube-api-access-ll5bj\") pod \"ceilometer-0\" (UID: \"a8e743d2-607c-4443-ac08-bb87e44e6f00\") " pod="openstack/ceilometer-0" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.815575 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8e743d2-607c-4443-ac08-bb87e44e6f00-config-data\") pod \"ceilometer-0\" (UID: \"a8e743d2-607c-4443-ac08-bb87e44e6f00\") " pod="openstack/ceilometer-0" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.815635 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a8e743d2-607c-4443-ac08-bb87e44e6f00-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a8e743d2-607c-4443-ac08-bb87e44e6f00\") " pod="openstack/ceilometer-0" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.815804 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8e743d2-607c-4443-ac08-bb87e44e6f00-run-httpd\") pod \"ceilometer-0\" (UID: \"a8e743d2-607c-4443-ac08-bb87e44e6f00\") " pod="openstack/ceilometer-0" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 
23:39:44.815887 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8e743d2-607c-4443-ac08-bb87e44e6f00-log-httpd\") pod \"ceilometer-0\" (UID: \"a8e743d2-607c-4443-ac08-bb87e44e6f00\") " pod="openstack/ceilometer-0" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.816803 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8e743d2-607c-4443-ac08-bb87e44e6f00-log-httpd\") pod \"ceilometer-0\" (UID: \"a8e743d2-607c-4443-ac08-bb87e44e6f00\") " pod="openstack/ceilometer-0" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.817030 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8e743d2-607c-4443-ac08-bb87e44e6f00-run-httpd\") pod \"ceilometer-0\" (UID: \"a8e743d2-607c-4443-ac08-bb87e44e6f00\") " pod="openstack/ceilometer-0" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.821934 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8e743d2-607c-4443-ac08-bb87e44e6f00-scripts\") pod \"ceilometer-0\" (UID: \"a8e743d2-607c-4443-ac08-bb87e44e6f00\") " pod="openstack/ceilometer-0" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.822425 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8e743d2-607c-4443-ac08-bb87e44e6f00-config-data\") pod \"ceilometer-0\" (UID: \"a8e743d2-607c-4443-ac08-bb87e44e6f00\") " pod="openstack/ceilometer-0" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.823059 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8e743d2-607c-4443-ac08-bb87e44e6f00-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a8e743d2-607c-4443-ac08-bb87e44e6f00\") " 
pod="openstack/ceilometer-0" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.825094 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a8e743d2-607c-4443-ac08-bb87e44e6f00-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a8e743d2-607c-4443-ac08-bb87e44e6f00\") " pod="openstack/ceilometer-0" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.833853 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll5bj\" (UniqueName: \"kubernetes.io/projected/a8e743d2-607c-4443-ac08-bb87e44e6f00-kube-api-access-ll5bj\") pod \"ceilometer-0\" (UID: \"a8e743d2-607c-4443-ac08-bb87e44e6f00\") " pod="openstack/ceilometer-0" Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.897779 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.898092 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="4cf3a204-9b47-4206-9964-deb892777324" containerName="kube-state-metrics" containerID="cri-o://a50fc2d887cea8301fb9cb505ce53e804ed8f68a83f980a2b3e0fd30af384a43" gracePeriod=30 Dec 05 23:39:44 crc kubenswrapper[4734]: I1205 23:39:44.935673 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 23:39:45 crc kubenswrapper[4734]: I1205 23:39:45.517386 4734 generic.go:334] "Generic (PLEG): container finished" podID="4cf3a204-9b47-4206-9964-deb892777324" containerID="a50fc2d887cea8301fb9cb505ce53e804ed8f68a83f980a2b3e0fd30af384a43" exitCode=2 Dec 05 23:39:45 crc kubenswrapper[4734]: I1205 23:39:45.517749 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4cf3a204-9b47-4206-9964-deb892777324","Type":"ContainerDied","Data":"a50fc2d887cea8301fb9cb505ce53e804ed8f68a83f980a2b3e0fd30af384a43"} Dec 05 23:39:45 crc kubenswrapper[4734]: I1205 23:39:45.562008 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 23:39:45 crc kubenswrapper[4734]: W1205 23:39:45.567343 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8e743d2_607c_4443_ac08_bb87e44e6f00.slice/crio-814db3b7e2072e8e85ed4979937c44cc9386e9d0afb9b9c1b9cf95a0563b038e WatchSource:0}: Error finding container 814db3b7e2072e8e85ed4979937c44cc9386e9d0afb9b9c1b9cf95a0563b038e: Status 404 returned error can't find the container with id 814db3b7e2072e8e85ed4979937c44cc9386e9d0afb9b9c1b9cf95a0563b038e Dec 05 23:39:45 crc kubenswrapper[4734]: I1205 23:39:45.627513 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe03fe87-03d0-45aa-a054-5e991c765ccc" path="/var/lib/kubelet/pods/fe03fe87-03d0-45aa-a054-5e991c765ccc/volumes" Dec 05 23:39:45 crc kubenswrapper[4734]: I1205 23:39:45.651164 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 23:39:45 crc kubenswrapper[4734]: I1205 23:39:45.754052 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fmwq\" (UniqueName: \"kubernetes.io/projected/4cf3a204-9b47-4206-9964-deb892777324-kube-api-access-4fmwq\") pod \"4cf3a204-9b47-4206-9964-deb892777324\" (UID: \"4cf3a204-9b47-4206-9964-deb892777324\") " Dec 05 23:39:45 crc kubenswrapper[4734]: I1205 23:39:45.766634 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cf3a204-9b47-4206-9964-deb892777324-kube-api-access-4fmwq" (OuterVolumeSpecName: "kube-api-access-4fmwq") pod "4cf3a204-9b47-4206-9964-deb892777324" (UID: "4cf3a204-9b47-4206-9964-deb892777324"). InnerVolumeSpecName "kube-api-access-4fmwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:39:45 crc kubenswrapper[4734]: I1205 23:39:45.856728 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fmwq\" (UniqueName: \"kubernetes.io/projected/4cf3a204-9b47-4206-9964-deb892777324-kube-api-access-4fmwq\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:46 crc kubenswrapper[4734]: I1205 23:39:46.529469 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8e743d2-607c-4443-ac08-bb87e44e6f00","Type":"ContainerStarted","Data":"c112d4a7158b33b0ce4ad10dab55450daa9761583fb37858bf4806aa0b50ac2f"} Dec 05 23:39:46 crc kubenswrapper[4734]: I1205 23:39:46.529985 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8e743d2-607c-4443-ac08-bb87e44e6f00","Type":"ContainerStarted","Data":"814db3b7e2072e8e85ed4979937c44cc9386e9d0afb9b9c1b9cf95a0563b038e"} Dec 05 23:39:46 crc kubenswrapper[4734]: I1205 23:39:46.532287 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"4cf3a204-9b47-4206-9964-deb892777324","Type":"ContainerDied","Data":"2c9738390b425ae10b5dae6687a99e02ab70086f7e3cfae786cd759a5d285b40"} Dec 05 23:39:46 crc kubenswrapper[4734]: I1205 23:39:46.532328 4734 scope.go:117] "RemoveContainer" containerID="a50fc2d887cea8301fb9cb505ce53e804ed8f68a83f980a2b3e0fd30af384a43" Dec 05 23:39:46 crc kubenswrapper[4734]: I1205 23:39:46.532418 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 23:39:46 crc kubenswrapper[4734]: I1205 23:39:46.615059 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 23:39:46 crc kubenswrapper[4734]: I1205 23:39:46.632156 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 23:39:46 crc kubenswrapper[4734]: I1205 23:39:46.641759 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 23:39:46 crc kubenswrapper[4734]: E1205 23:39:46.642462 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cf3a204-9b47-4206-9964-deb892777324" containerName="kube-state-metrics" Dec 05 23:39:46 crc kubenswrapper[4734]: I1205 23:39:46.642487 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cf3a204-9b47-4206-9964-deb892777324" containerName="kube-state-metrics" Dec 05 23:39:46 crc kubenswrapper[4734]: I1205 23:39:46.642756 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cf3a204-9b47-4206-9964-deb892777324" containerName="kube-state-metrics" Dec 05 23:39:46 crc kubenswrapper[4734]: I1205 23:39:46.643762 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 23:39:46 crc kubenswrapper[4734]: I1205 23:39:46.649000 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 05 23:39:46 crc kubenswrapper[4734]: I1205 23:39:46.649319 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 05 23:39:46 crc kubenswrapper[4734]: I1205 23:39:46.650686 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 23:39:46 crc kubenswrapper[4734]: I1205 23:39:46.781228 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e4c76f3a-43a9-43fc-be28-d7d3081d5e39-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e4c76f3a-43a9-43fc-be28-d7d3081d5e39\") " pod="openstack/kube-state-metrics-0" Dec 05 23:39:46 crc kubenswrapper[4734]: I1205 23:39:46.781299 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c76f3a-43a9-43fc-be28-d7d3081d5e39-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e4c76f3a-43a9-43fc-be28-d7d3081d5e39\") " pod="openstack/kube-state-metrics-0" Dec 05 23:39:46 crc kubenswrapper[4734]: I1205 23:39:46.781360 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4c76f3a-43a9-43fc-be28-d7d3081d5e39-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e4c76f3a-43a9-43fc-be28-d7d3081d5e39\") " pod="openstack/kube-state-metrics-0" Dec 05 23:39:46 crc kubenswrapper[4734]: I1205 23:39:46.781479 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2hv5\" (UniqueName: 
\"kubernetes.io/projected/e4c76f3a-43a9-43fc-be28-d7d3081d5e39-kube-api-access-z2hv5\") pod \"kube-state-metrics-0\" (UID: \"e4c76f3a-43a9-43fc-be28-d7d3081d5e39\") " pod="openstack/kube-state-metrics-0" Dec 05 23:39:46 crc kubenswrapper[4734]: I1205 23:39:46.883026 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2hv5\" (UniqueName: \"kubernetes.io/projected/e4c76f3a-43a9-43fc-be28-d7d3081d5e39-kube-api-access-z2hv5\") pod \"kube-state-metrics-0\" (UID: \"e4c76f3a-43a9-43fc-be28-d7d3081d5e39\") " pod="openstack/kube-state-metrics-0" Dec 05 23:39:46 crc kubenswrapper[4734]: I1205 23:39:46.883567 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e4c76f3a-43a9-43fc-be28-d7d3081d5e39-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e4c76f3a-43a9-43fc-be28-d7d3081d5e39\") " pod="openstack/kube-state-metrics-0" Dec 05 23:39:46 crc kubenswrapper[4734]: I1205 23:39:46.883620 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c76f3a-43a9-43fc-be28-d7d3081d5e39-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e4c76f3a-43a9-43fc-be28-d7d3081d5e39\") " pod="openstack/kube-state-metrics-0" Dec 05 23:39:46 crc kubenswrapper[4734]: I1205 23:39:46.883667 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4c76f3a-43a9-43fc-be28-d7d3081d5e39-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e4c76f3a-43a9-43fc-be28-d7d3081d5e39\") " pod="openstack/kube-state-metrics-0" Dec 05 23:39:46 crc kubenswrapper[4734]: I1205 23:39:46.890939 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e4c76f3a-43a9-43fc-be28-d7d3081d5e39-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e4c76f3a-43a9-43fc-be28-d7d3081d5e39\") " pod="openstack/kube-state-metrics-0" Dec 05 23:39:46 crc kubenswrapper[4734]: I1205 23:39:46.892298 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c76f3a-43a9-43fc-be28-d7d3081d5e39-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e4c76f3a-43a9-43fc-be28-d7d3081d5e39\") " pod="openstack/kube-state-metrics-0" Dec 05 23:39:46 crc kubenswrapper[4734]: I1205 23:39:46.898029 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e4c76f3a-43a9-43fc-be28-d7d3081d5e39-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e4c76f3a-43a9-43fc-be28-d7d3081d5e39\") " pod="openstack/kube-state-metrics-0" Dec 05 23:39:46 crc kubenswrapper[4734]: I1205 23:39:46.914300 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2hv5\" (UniqueName: \"kubernetes.io/projected/e4c76f3a-43a9-43fc-be28-d7d3081d5e39-kube-api-access-z2hv5\") pod \"kube-state-metrics-0\" (UID: \"e4c76f3a-43a9-43fc-be28-d7d3081d5e39\") " pod="openstack/kube-state-metrics-0" Dec 05 23:39:47 crc kubenswrapper[4734]: I1205 23:39:47.118264 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 23:39:47 crc kubenswrapper[4734]: I1205 23:39:47.570897 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8e743d2-607c-4443-ac08-bb87e44e6f00","Type":"ContainerStarted","Data":"bef99b6024c17c0ee165ac012c0478670a825f58e6c16c18b15da999f92cbc6b"} Dec 05 23:39:47 crc kubenswrapper[4734]: I1205 23:39:47.646390 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cf3a204-9b47-4206-9964-deb892777324" path="/var/lib/kubelet/pods/4cf3a204-9b47-4206-9964-deb892777324/volumes" Dec 05 23:39:47 crc kubenswrapper[4734]: I1205 23:39:47.709189 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 23:39:47 crc kubenswrapper[4734]: W1205 23:39:47.817656 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4c76f3a_43a9_43fc_be28_d7d3081d5e39.slice/crio-4acb9df0fa96b3d908d88ee3a819df9e4791b9b5fe769c3f769e0ebbe0389b0d WatchSource:0}: Error finding container 4acb9df0fa96b3d908d88ee3a819df9e4791b9b5fe769c3f769e0ebbe0389b0d: Status 404 returned error can't find the container with id 4acb9df0fa96b3d908d88ee3a819df9e4791b9b5fe769c3f769e0ebbe0389b0d Dec 05 23:39:47 crc kubenswrapper[4734]: I1205 23:39:47.820802 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 23:39:48 crc kubenswrapper[4734]: I1205 23:39:48.588831 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8e743d2-607c-4443-ac08-bb87e44e6f00","Type":"ContainerStarted","Data":"241910885ec436b3e98643f3430f035407b7347dec29ce2fc83653fe38b760e6"} Dec 05 23:39:48 crc kubenswrapper[4734]: I1205 23:39:48.593768 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"e4c76f3a-43a9-43fc-be28-d7d3081d5e39","Type":"ContainerStarted","Data":"23342d3ed3c642c37736feb65893af63560290bf5f35ed0b142b195c6ada8bf7"} Dec 05 23:39:48 crc kubenswrapper[4734]: I1205 23:39:48.593832 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e4c76f3a-43a9-43fc-be28-d7d3081d5e39","Type":"ContainerStarted","Data":"4acb9df0fa96b3d908d88ee3a819df9e4791b9b5fe769c3f769e0ebbe0389b0d"} Dec 05 23:39:48 crc kubenswrapper[4734]: I1205 23:39:48.595321 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 05 23:39:48 crc kubenswrapper[4734]: I1205 23:39:48.619750 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.264895673 podStartE2EDuration="2.619727572s" podCreationTimestamp="2025-12-05 23:39:46 +0000 UTC" firstStartedPulling="2025-12-05 23:39:47.821801886 +0000 UTC m=+1208.505206162" lastFinishedPulling="2025-12-05 23:39:48.176633795 +0000 UTC m=+1208.860038061" observedRunningTime="2025-12-05 23:39:48.61594269 +0000 UTC m=+1209.299346976" watchObservedRunningTime="2025-12-05 23:39:48.619727572 +0000 UTC m=+1209.303131848" Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.030415 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5d469948dd-n7t4x" Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.144898 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c96cd173-4707-4edc-a92e-35db297082e2-logs\") pod \"c96cd173-4707-4edc-a92e-35db297082e2\" (UID: \"c96cd173-4707-4edc-a92e-35db297082e2\") " Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.144968 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c96cd173-4707-4edc-a92e-35db297082e2-combined-ca-bundle\") pod \"c96cd173-4707-4edc-a92e-35db297082e2\" (UID: \"c96cd173-4707-4edc-a92e-35db297082e2\") " Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.145018 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c96cd173-4707-4edc-a92e-35db297082e2-scripts\") pod \"c96cd173-4707-4edc-a92e-35db297082e2\" (UID: \"c96cd173-4707-4edc-a92e-35db297082e2\") " Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.145090 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c96cd173-4707-4edc-a92e-35db297082e2-horizon-secret-key\") pod \"c96cd173-4707-4edc-a92e-35db297082e2\" (UID: \"c96cd173-4707-4edc-a92e-35db297082e2\") " Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.145294 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/c96cd173-4707-4edc-a92e-35db297082e2-horizon-tls-certs\") pod \"c96cd173-4707-4edc-a92e-35db297082e2\" (UID: \"c96cd173-4707-4edc-a92e-35db297082e2\") " Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.145379 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/c96cd173-4707-4edc-a92e-35db297082e2-config-data\") pod \"c96cd173-4707-4edc-a92e-35db297082e2\" (UID: \"c96cd173-4707-4edc-a92e-35db297082e2\") " Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.145437 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw48z\" (UniqueName: \"kubernetes.io/projected/c96cd173-4707-4edc-a92e-35db297082e2-kube-api-access-rw48z\") pod \"c96cd173-4707-4edc-a92e-35db297082e2\" (UID: \"c96cd173-4707-4edc-a92e-35db297082e2\") " Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.145466 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c96cd173-4707-4edc-a92e-35db297082e2-logs" (OuterVolumeSpecName: "logs") pod "c96cd173-4707-4edc-a92e-35db297082e2" (UID: "c96cd173-4707-4edc-a92e-35db297082e2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.146028 4734 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c96cd173-4707-4edc-a92e-35db297082e2-logs\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.152457 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c96cd173-4707-4edc-a92e-35db297082e2-kube-api-access-rw48z" (OuterVolumeSpecName: "kube-api-access-rw48z") pod "c96cd173-4707-4edc-a92e-35db297082e2" (UID: "c96cd173-4707-4edc-a92e-35db297082e2"). InnerVolumeSpecName "kube-api-access-rw48z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.154709 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c96cd173-4707-4edc-a92e-35db297082e2-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c96cd173-4707-4edc-a92e-35db297082e2" (UID: "c96cd173-4707-4edc-a92e-35db297082e2"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.188668 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c96cd173-4707-4edc-a92e-35db297082e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c96cd173-4707-4edc-a92e-35db297082e2" (UID: "c96cd173-4707-4edc-a92e-35db297082e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.189005 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c96cd173-4707-4edc-a92e-35db297082e2-config-data" (OuterVolumeSpecName: "config-data") pod "c96cd173-4707-4edc-a92e-35db297082e2" (UID: "c96cd173-4707-4edc-a92e-35db297082e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.189849 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c96cd173-4707-4edc-a92e-35db297082e2-scripts" (OuterVolumeSpecName: "scripts") pod "c96cd173-4707-4edc-a92e-35db297082e2" (UID: "c96cd173-4707-4edc-a92e-35db297082e2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.223766 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c96cd173-4707-4edc-a92e-35db297082e2-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "c96cd173-4707-4edc-a92e-35db297082e2" (UID: "c96cd173-4707-4edc-a92e-35db297082e2"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.248567 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c96cd173-4707-4edc-a92e-35db297082e2-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.248606 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw48z\" (UniqueName: \"kubernetes.io/projected/c96cd173-4707-4edc-a92e-35db297082e2-kube-api-access-rw48z\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.248620 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c96cd173-4707-4edc-a92e-35db297082e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.248633 4734 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c96cd173-4707-4edc-a92e-35db297082e2-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.248642 4734 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c96cd173-4707-4edc-a92e-35db297082e2-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.248651 4734 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/c96cd173-4707-4edc-a92e-35db297082e2-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.616925 4734 generic.go:334] "Generic (PLEG): container finished" podID="c96cd173-4707-4edc-a92e-35db297082e2" containerID="8415f4e36ba22f55e16f34c1af4d5e303eef0e3ab5c7ab16635ff6313372655d" exitCode=137 Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.617059 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d469948dd-n7t4x" Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.634995 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a8e743d2-607c-4443-ac08-bb87e44e6f00" containerName="ceilometer-central-agent" containerID="cri-o://c112d4a7158b33b0ce4ad10dab55450daa9761583fb37858bf4806aa0b50ac2f" gracePeriod=30 Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.635178 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a8e743d2-607c-4443-ac08-bb87e44e6f00" containerName="proxy-httpd" containerID="cri-o://381adfd6a09a33ccc2fcc3b92590ddb39a0ad4037c9eb7ee67a169c2a6a513a0" gracePeriod=30 Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.635225 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a8e743d2-607c-4443-ac08-bb87e44e6f00" containerName="sg-core" containerID="cri-o://241910885ec436b3e98643f3430f035407b7347dec29ce2fc83653fe38b760e6" gracePeriod=30 Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.635258 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a8e743d2-607c-4443-ac08-bb87e44e6f00" containerName="ceilometer-notification-agent" containerID="cri-o://bef99b6024c17c0ee165ac012c0478670a825f58e6c16c18b15da999f92cbc6b" gracePeriod=30 Dec 05 23:39:49 crc 
kubenswrapper[4734]: I1205 23:39:49.635968 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d469948dd-n7t4x" event={"ID":"c96cd173-4707-4edc-a92e-35db297082e2","Type":"ContainerDied","Data":"8415f4e36ba22f55e16f34c1af4d5e303eef0e3ab5c7ab16635ff6313372655d"} Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.636006 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d469948dd-n7t4x" event={"ID":"c96cd173-4707-4edc-a92e-35db297082e2","Type":"ContainerDied","Data":"c7b6d2ab251d04302cb99b80cf288a7b1730acbf3c36683bc32889fd89ca00f9"} Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.636030 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.636044 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8e743d2-607c-4443-ac08-bb87e44e6f00","Type":"ContainerStarted","Data":"381adfd6a09a33ccc2fcc3b92590ddb39a0ad4037c9eb7ee67a169c2a6a513a0"} Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.636065 4734 scope.go:117] "RemoveContainer" containerID="9514774e87490121e03a56ce448d088925b7977bc46c6a8cf0102e7f1954b7d8" Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.697663 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.553619834 podStartE2EDuration="5.697634253s" podCreationTimestamp="2025-12-05 23:39:44 +0000 UTC" firstStartedPulling="2025-12-05 23:39:45.570306606 +0000 UTC m=+1206.253710882" lastFinishedPulling="2025-12-05 23:39:48.714321025 +0000 UTC m=+1209.397725301" observedRunningTime="2025-12-05 23:39:49.692036427 +0000 UTC m=+1210.375440703" watchObservedRunningTime="2025-12-05 23:39:49.697634253 +0000 UTC m=+1210.381038519" Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.725985 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/horizon-5d469948dd-n7t4x"] Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.734547 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5d469948dd-n7t4x"] Dec 05 23:39:49 crc kubenswrapper[4734]: I1205 23:39:49.854297 4734 scope.go:117] "RemoveContainer" containerID="8415f4e36ba22f55e16f34c1af4d5e303eef0e3ab5c7ab16635ff6313372655d" Dec 05 23:39:50 crc kubenswrapper[4734]: I1205 23:39:49.999506 4734 scope.go:117] "RemoveContainer" containerID="9514774e87490121e03a56ce448d088925b7977bc46c6a8cf0102e7f1954b7d8" Dec 05 23:39:50 crc kubenswrapper[4734]: E1205 23:39:50.000210 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9514774e87490121e03a56ce448d088925b7977bc46c6a8cf0102e7f1954b7d8\": container with ID starting with 9514774e87490121e03a56ce448d088925b7977bc46c6a8cf0102e7f1954b7d8 not found: ID does not exist" containerID="9514774e87490121e03a56ce448d088925b7977bc46c6a8cf0102e7f1954b7d8" Dec 05 23:39:50 crc kubenswrapper[4734]: I1205 23:39:50.000279 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9514774e87490121e03a56ce448d088925b7977bc46c6a8cf0102e7f1954b7d8"} err="failed to get container status \"9514774e87490121e03a56ce448d088925b7977bc46c6a8cf0102e7f1954b7d8\": rpc error: code = NotFound desc = could not find container \"9514774e87490121e03a56ce448d088925b7977bc46c6a8cf0102e7f1954b7d8\": container with ID starting with 9514774e87490121e03a56ce448d088925b7977bc46c6a8cf0102e7f1954b7d8 not found: ID does not exist" Dec 05 23:39:50 crc kubenswrapper[4734]: I1205 23:39:50.000319 4734 scope.go:117] "RemoveContainer" containerID="8415f4e36ba22f55e16f34c1af4d5e303eef0e3ab5c7ab16635ff6313372655d" Dec 05 23:39:50 crc kubenswrapper[4734]: E1205 23:39:50.000794 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8415f4e36ba22f55e16f34c1af4d5e303eef0e3ab5c7ab16635ff6313372655d\": container with ID starting with 8415f4e36ba22f55e16f34c1af4d5e303eef0e3ab5c7ab16635ff6313372655d not found: ID does not exist" containerID="8415f4e36ba22f55e16f34c1af4d5e303eef0e3ab5c7ab16635ff6313372655d" Dec 05 23:39:50 crc kubenswrapper[4734]: I1205 23:39:50.000823 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8415f4e36ba22f55e16f34c1af4d5e303eef0e3ab5c7ab16635ff6313372655d"} err="failed to get container status \"8415f4e36ba22f55e16f34c1af4d5e303eef0e3ab5c7ab16635ff6313372655d\": rpc error: code = NotFound desc = could not find container \"8415f4e36ba22f55e16f34c1af4d5e303eef0e3ab5c7ab16635ff6313372655d\": container with ID starting with 8415f4e36ba22f55e16f34c1af4d5e303eef0e3ab5c7ab16635ff6313372655d not found: ID does not exist" Dec 05 23:39:50 crc kubenswrapper[4734]: E1205 23:39:50.259724 4734 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8e743d2_607c_4443_ac08_bb87e44e6f00.slice/crio-conmon-bef99b6024c17c0ee165ac012c0478670a825f58e6c16c18b15da999f92cbc6b.scope\": RecentStats: unable to find data in memory cache]" Dec 05 23:39:50 crc kubenswrapper[4734]: I1205 23:39:50.444939 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:39:50 crc kubenswrapper[4734]: I1205 23:39:50.445024 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 05 23:39:50 crc kubenswrapper[4734]: I1205 23:39:50.445100 4734 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" Dec 05 23:39:50 crc kubenswrapper[4734]: I1205 23:39:50.446175 4734 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5119dd9005e526fae1b15071e6d704440bd8834afc5ec6ce50aaa9f27c74ff90"} pod="openshift-machine-config-operator/machine-config-daemon-vn94d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 23:39:50 crc kubenswrapper[4734]: I1205 23:39:50.446274 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" containerID="cri-o://5119dd9005e526fae1b15071e6d704440bd8834afc5ec6ce50aaa9f27c74ff90" gracePeriod=600 Dec 05 23:39:50 crc kubenswrapper[4734]: I1205 23:39:50.786058 4734 generic.go:334] "Generic (PLEG): container finished" podID="65758270-a7a7-46b5-af95-0588daf9fa86" containerID="5119dd9005e526fae1b15071e6d704440bd8834afc5ec6ce50aaa9f27c74ff90" exitCode=0 Dec 05 23:39:50 crc kubenswrapper[4734]: I1205 23:39:50.786628 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerDied","Data":"5119dd9005e526fae1b15071e6d704440bd8834afc5ec6ce50aaa9f27c74ff90"} Dec 05 23:39:50 crc kubenswrapper[4734]: I1205 23:39:50.786678 4734 scope.go:117] "RemoveContainer" containerID="ce94e8a7ce0afa1b302b0a1993b5d90206c505bc6302ab5507859a6eab1dd7e0" Dec 05 23:39:50 crc kubenswrapper[4734]: I1205 23:39:50.879767 4734 generic.go:334] "Generic (PLEG): container finished" 
podID="a8e743d2-607c-4443-ac08-bb87e44e6f00" containerID="381adfd6a09a33ccc2fcc3b92590ddb39a0ad4037c9eb7ee67a169c2a6a513a0" exitCode=0 Dec 05 23:39:50 crc kubenswrapper[4734]: I1205 23:39:50.880371 4734 generic.go:334] "Generic (PLEG): container finished" podID="a8e743d2-607c-4443-ac08-bb87e44e6f00" containerID="241910885ec436b3e98643f3430f035407b7347dec29ce2fc83653fe38b760e6" exitCode=2 Dec 05 23:39:50 crc kubenswrapper[4734]: I1205 23:39:50.880459 4734 generic.go:334] "Generic (PLEG): container finished" podID="a8e743d2-607c-4443-ac08-bb87e44e6f00" containerID="bef99b6024c17c0ee165ac012c0478670a825f58e6c16c18b15da999f92cbc6b" exitCode=0 Dec 05 23:39:50 crc kubenswrapper[4734]: I1205 23:39:50.880630 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8e743d2-607c-4443-ac08-bb87e44e6f00","Type":"ContainerDied","Data":"381adfd6a09a33ccc2fcc3b92590ddb39a0ad4037c9eb7ee67a169c2a6a513a0"} Dec 05 23:39:50 crc kubenswrapper[4734]: I1205 23:39:50.880697 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8e743d2-607c-4443-ac08-bb87e44e6f00","Type":"ContainerDied","Data":"241910885ec436b3e98643f3430f035407b7347dec29ce2fc83653fe38b760e6"} Dec 05 23:39:50 crc kubenswrapper[4734]: I1205 23:39:50.880713 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8e743d2-607c-4443-ac08-bb87e44e6f00","Type":"ContainerDied","Data":"bef99b6024c17c0ee165ac012c0478670a825f58e6c16c18b15da999f92cbc6b"} Dec 05 23:39:51 crc kubenswrapper[4734]: I1205 23:39:51.627426 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c96cd173-4707-4edc-a92e-35db297082e2" path="/var/lib/kubelet/pods/c96cd173-4707-4edc-a92e-35db297082e2/volumes" Dec 05 23:39:51 crc kubenswrapper[4734]: I1205 23:39:51.894728 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" 
event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerStarted","Data":"b56de5effd3c2004c857decd42f072613bd8b7411853b07107e3e799cc6c9cfb"} Dec 05 23:39:52 crc kubenswrapper[4734]: I1205 23:39:52.719106 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 23:39:52 crc kubenswrapper[4734]: I1205 23:39:52.861784 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ll5bj\" (UniqueName: \"kubernetes.io/projected/a8e743d2-607c-4443-ac08-bb87e44e6f00-kube-api-access-ll5bj\") pod \"a8e743d2-607c-4443-ac08-bb87e44e6f00\" (UID: \"a8e743d2-607c-4443-ac08-bb87e44e6f00\") " Dec 05 23:39:52 crc kubenswrapper[4734]: I1205 23:39:52.861902 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8e743d2-607c-4443-ac08-bb87e44e6f00-log-httpd\") pod \"a8e743d2-607c-4443-ac08-bb87e44e6f00\" (UID: \"a8e743d2-607c-4443-ac08-bb87e44e6f00\") " Dec 05 23:39:52 crc kubenswrapper[4734]: I1205 23:39:52.861954 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8e743d2-607c-4443-ac08-bb87e44e6f00-config-data\") pod \"a8e743d2-607c-4443-ac08-bb87e44e6f00\" (UID: \"a8e743d2-607c-4443-ac08-bb87e44e6f00\") " Dec 05 23:39:52 crc kubenswrapper[4734]: I1205 23:39:52.862067 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8e743d2-607c-4443-ac08-bb87e44e6f00-scripts\") pod \"a8e743d2-607c-4443-ac08-bb87e44e6f00\" (UID: \"a8e743d2-607c-4443-ac08-bb87e44e6f00\") " Dec 05 23:39:52 crc kubenswrapper[4734]: I1205 23:39:52.862114 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a8e743d2-607c-4443-ac08-bb87e44e6f00-sg-core-conf-yaml\") pod 
\"a8e743d2-607c-4443-ac08-bb87e44e6f00\" (UID: \"a8e743d2-607c-4443-ac08-bb87e44e6f00\") " Dec 05 23:39:52 crc kubenswrapper[4734]: I1205 23:39:52.862246 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8e743d2-607c-4443-ac08-bb87e44e6f00-combined-ca-bundle\") pod \"a8e743d2-607c-4443-ac08-bb87e44e6f00\" (UID: \"a8e743d2-607c-4443-ac08-bb87e44e6f00\") " Dec 05 23:39:52 crc kubenswrapper[4734]: I1205 23:39:52.862276 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8e743d2-607c-4443-ac08-bb87e44e6f00-run-httpd\") pod \"a8e743d2-607c-4443-ac08-bb87e44e6f00\" (UID: \"a8e743d2-607c-4443-ac08-bb87e44e6f00\") " Dec 05 23:39:52 crc kubenswrapper[4734]: I1205 23:39:52.863017 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8e743d2-607c-4443-ac08-bb87e44e6f00-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a8e743d2-607c-4443-ac08-bb87e44e6f00" (UID: "a8e743d2-607c-4443-ac08-bb87e44e6f00"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:39:52 crc kubenswrapper[4734]: I1205 23:39:52.863108 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8e743d2-607c-4443-ac08-bb87e44e6f00-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a8e743d2-607c-4443-ac08-bb87e44e6f00" (UID: "a8e743d2-607c-4443-ac08-bb87e44e6f00"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:39:52 crc kubenswrapper[4734]: I1205 23:39:52.871859 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8e743d2-607c-4443-ac08-bb87e44e6f00-kube-api-access-ll5bj" (OuterVolumeSpecName: "kube-api-access-ll5bj") pod "a8e743d2-607c-4443-ac08-bb87e44e6f00" (UID: "a8e743d2-607c-4443-ac08-bb87e44e6f00"). InnerVolumeSpecName "kube-api-access-ll5bj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:39:52 crc kubenswrapper[4734]: I1205 23:39:52.872187 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8e743d2-607c-4443-ac08-bb87e44e6f00-scripts" (OuterVolumeSpecName: "scripts") pod "a8e743d2-607c-4443-ac08-bb87e44e6f00" (UID: "a8e743d2-607c-4443-ac08-bb87e44e6f00"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:52 crc kubenswrapper[4734]: I1205 23:39:52.912047 4734 generic.go:334] "Generic (PLEG): container finished" podID="a8e743d2-607c-4443-ac08-bb87e44e6f00" containerID="c112d4a7158b33b0ce4ad10dab55450daa9761583fb37858bf4806aa0b50ac2f" exitCode=0 Dec 05 23:39:52 crc kubenswrapper[4734]: I1205 23:39:52.912131 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8e743d2-607c-4443-ac08-bb87e44e6f00","Type":"ContainerDied","Data":"c112d4a7158b33b0ce4ad10dab55450daa9761583fb37858bf4806aa0b50ac2f"} Dec 05 23:39:52 crc kubenswrapper[4734]: I1205 23:39:52.912189 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8e743d2-607c-4443-ac08-bb87e44e6f00","Type":"ContainerDied","Data":"814db3b7e2072e8e85ed4979937c44cc9386e9d0afb9b9c1b9cf95a0563b038e"} Dec 05 23:39:52 crc kubenswrapper[4734]: I1205 23:39:52.912197 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 23:39:52 crc kubenswrapper[4734]: I1205 23:39:52.912215 4734 scope.go:117] "RemoveContainer" containerID="381adfd6a09a33ccc2fcc3b92590ddb39a0ad4037c9eb7ee67a169c2a6a513a0" Dec 05 23:39:52 crc kubenswrapper[4734]: I1205 23:39:52.943187 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8e743d2-607c-4443-ac08-bb87e44e6f00-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a8e743d2-607c-4443-ac08-bb87e44e6f00" (UID: "a8e743d2-607c-4443-ac08-bb87e44e6f00"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:52 crc kubenswrapper[4734]: I1205 23:39:52.965312 4734 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a8e743d2-607c-4443-ac08-bb87e44e6f00-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:52 crc kubenswrapper[4734]: I1205 23:39:52.965357 4734 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8e743d2-607c-4443-ac08-bb87e44e6f00-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:52 crc kubenswrapper[4734]: I1205 23:39:52.965372 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ll5bj\" (UniqueName: \"kubernetes.io/projected/a8e743d2-607c-4443-ac08-bb87e44e6f00-kube-api-access-ll5bj\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:52 crc kubenswrapper[4734]: I1205 23:39:52.965384 4734 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8e743d2-607c-4443-ac08-bb87e44e6f00-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:52 crc kubenswrapper[4734]: I1205 23:39:52.965396 4734 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8e743d2-607c-4443-ac08-bb87e44e6f00-scripts\") on node \"crc\" DevicePath \"\"" 
Dec 05 23:39:52 crc kubenswrapper[4734]: I1205 23:39:52.977679 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8e743d2-607c-4443-ac08-bb87e44e6f00-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8e743d2-607c-4443-ac08-bb87e44e6f00" (UID: "a8e743d2-607c-4443-ac08-bb87e44e6f00"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.011757 4734 scope.go:117] "RemoveContainer" containerID="241910885ec436b3e98643f3430f035407b7347dec29ce2fc83653fe38b760e6" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.019161 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8e743d2-607c-4443-ac08-bb87e44e6f00-config-data" (OuterVolumeSpecName: "config-data") pod "a8e743d2-607c-4443-ac08-bb87e44e6f00" (UID: "a8e743d2-607c-4443-ac08-bb87e44e6f00"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.036559 4734 scope.go:117] "RemoveContainer" containerID="bef99b6024c17c0ee165ac012c0478670a825f58e6c16c18b15da999f92cbc6b" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.063177 4734 scope.go:117] "RemoveContainer" containerID="c112d4a7158b33b0ce4ad10dab55450daa9761583fb37858bf4806aa0b50ac2f" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.067258 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8e743d2-607c-4443-ac08-bb87e44e6f00-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.067424 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8e743d2-607c-4443-ac08-bb87e44e6f00-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 
23:39:53.089058 4734 scope.go:117] "RemoveContainer" containerID="381adfd6a09a33ccc2fcc3b92590ddb39a0ad4037c9eb7ee67a169c2a6a513a0" Dec 05 23:39:53 crc kubenswrapper[4734]: E1205 23:39:53.089697 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"381adfd6a09a33ccc2fcc3b92590ddb39a0ad4037c9eb7ee67a169c2a6a513a0\": container with ID starting with 381adfd6a09a33ccc2fcc3b92590ddb39a0ad4037c9eb7ee67a169c2a6a513a0 not found: ID does not exist" containerID="381adfd6a09a33ccc2fcc3b92590ddb39a0ad4037c9eb7ee67a169c2a6a513a0" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.089750 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"381adfd6a09a33ccc2fcc3b92590ddb39a0ad4037c9eb7ee67a169c2a6a513a0"} err="failed to get container status \"381adfd6a09a33ccc2fcc3b92590ddb39a0ad4037c9eb7ee67a169c2a6a513a0\": rpc error: code = NotFound desc = could not find container \"381adfd6a09a33ccc2fcc3b92590ddb39a0ad4037c9eb7ee67a169c2a6a513a0\": container with ID starting with 381adfd6a09a33ccc2fcc3b92590ddb39a0ad4037c9eb7ee67a169c2a6a513a0 not found: ID does not exist" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.089793 4734 scope.go:117] "RemoveContainer" containerID="241910885ec436b3e98643f3430f035407b7347dec29ce2fc83653fe38b760e6" Dec 05 23:39:53 crc kubenswrapper[4734]: E1205 23:39:53.090482 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"241910885ec436b3e98643f3430f035407b7347dec29ce2fc83653fe38b760e6\": container with ID starting with 241910885ec436b3e98643f3430f035407b7347dec29ce2fc83653fe38b760e6 not found: ID does not exist" containerID="241910885ec436b3e98643f3430f035407b7347dec29ce2fc83653fe38b760e6" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.090578 4734 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"241910885ec436b3e98643f3430f035407b7347dec29ce2fc83653fe38b760e6"} err="failed to get container status \"241910885ec436b3e98643f3430f035407b7347dec29ce2fc83653fe38b760e6\": rpc error: code = NotFound desc = could not find container \"241910885ec436b3e98643f3430f035407b7347dec29ce2fc83653fe38b760e6\": container with ID starting with 241910885ec436b3e98643f3430f035407b7347dec29ce2fc83653fe38b760e6 not found: ID does not exist" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.090634 4734 scope.go:117] "RemoveContainer" containerID="bef99b6024c17c0ee165ac012c0478670a825f58e6c16c18b15da999f92cbc6b" Dec 05 23:39:53 crc kubenswrapper[4734]: E1205 23:39:53.090990 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bef99b6024c17c0ee165ac012c0478670a825f58e6c16c18b15da999f92cbc6b\": container with ID starting with bef99b6024c17c0ee165ac012c0478670a825f58e6c16c18b15da999f92cbc6b not found: ID does not exist" containerID="bef99b6024c17c0ee165ac012c0478670a825f58e6c16c18b15da999f92cbc6b" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.091015 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bef99b6024c17c0ee165ac012c0478670a825f58e6c16c18b15da999f92cbc6b"} err="failed to get container status \"bef99b6024c17c0ee165ac012c0478670a825f58e6c16c18b15da999f92cbc6b\": rpc error: code = NotFound desc = could not find container \"bef99b6024c17c0ee165ac012c0478670a825f58e6c16c18b15da999f92cbc6b\": container with ID starting with bef99b6024c17c0ee165ac012c0478670a825f58e6c16c18b15da999f92cbc6b not found: ID does not exist" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.091033 4734 scope.go:117] "RemoveContainer" containerID="c112d4a7158b33b0ce4ad10dab55450daa9761583fb37858bf4806aa0b50ac2f" Dec 05 23:39:53 crc kubenswrapper[4734]: E1205 23:39:53.091447 4734 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c112d4a7158b33b0ce4ad10dab55450daa9761583fb37858bf4806aa0b50ac2f\": container with ID starting with c112d4a7158b33b0ce4ad10dab55450daa9761583fb37858bf4806aa0b50ac2f not found: ID does not exist" containerID="c112d4a7158b33b0ce4ad10dab55450daa9761583fb37858bf4806aa0b50ac2f" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.091491 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c112d4a7158b33b0ce4ad10dab55450daa9761583fb37858bf4806aa0b50ac2f"} err="failed to get container status \"c112d4a7158b33b0ce4ad10dab55450daa9761583fb37858bf4806aa0b50ac2f\": rpc error: code = NotFound desc = could not find container \"c112d4a7158b33b0ce4ad10dab55450daa9761583fb37858bf4806aa0b50ac2f\": container with ID starting with c112d4a7158b33b0ce4ad10dab55450daa9761583fb37858bf4806aa0b50ac2f not found: ID does not exist" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.099866 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.100201 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="af968080-3e37-4034-90a7-0b654e68ee89" containerName="glance-log" containerID="cri-o://880b3909e293946d8333fcbf933f98a0d14d6594cac4b9529e4bbc96550c5578" gracePeriod=30 Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.100332 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="af968080-3e37-4034-90a7-0b654e68ee89" containerName="glance-httpd" containerID="cri-o://1b47e509e05bda36df057f4e42fdaa560d8dbd474320378bd221a11489d58c65" gracePeriod=30 Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.248273 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 23:39:53 crc 
kubenswrapper[4734]: I1205 23:39:53.256179 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.280855 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 23:39:53 crc kubenswrapper[4734]: E1205 23:39:53.281542 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8e743d2-607c-4443-ac08-bb87e44e6f00" containerName="proxy-httpd" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.281636 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8e743d2-607c-4443-ac08-bb87e44e6f00" containerName="proxy-httpd" Dec 05 23:39:53 crc kubenswrapper[4734]: E1205 23:39:53.281693 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8e743d2-607c-4443-ac08-bb87e44e6f00" containerName="sg-core" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.281742 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8e743d2-607c-4443-ac08-bb87e44e6f00" containerName="sg-core" Dec 05 23:39:53 crc kubenswrapper[4734]: E1205 23:39:53.281797 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c96cd173-4707-4edc-a92e-35db297082e2" containerName="horizon" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.281843 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="c96cd173-4707-4edc-a92e-35db297082e2" containerName="horizon" Dec 05 23:39:53 crc kubenswrapper[4734]: E1205 23:39:53.281907 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8e743d2-607c-4443-ac08-bb87e44e6f00" containerName="ceilometer-central-agent" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.281965 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8e743d2-607c-4443-ac08-bb87e44e6f00" containerName="ceilometer-central-agent" Dec 05 23:39:53 crc kubenswrapper[4734]: E1205 23:39:53.282025 4734 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c96cd173-4707-4edc-a92e-35db297082e2" containerName="horizon-log" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.282090 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="c96cd173-4707-4edc-a92e-35db297082e2" containerName="horizon-log" Dec 05 23:39:53 crc kubenswrapper[4734]: E1205 23:39:53.282141 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8e743d2-607c-4443-ac08-bb87e44e6f00" containerName="ceilometer-notification-agent" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.282190 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8e743d2-607c-4443-ac08-bb87e44e6f00" containerName="ceilometer-notification-agent" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.282416 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8e743d2-607c-4443-ac08-bb87e44e6f00" containerName="sg-core" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.282475 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="c96cd173-4707-4edc-a92e-35db297082e2" containerName="horizon" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.282554 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8e743d2-607c-4443-ac08-bb87e44e6f00" containerName="ceilometer-notification-agent" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.282614 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8e743d2-607c-4443-ac08-bb87e44e6f00" containerName="proxy-httpd" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.282676 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="c96cd173-4707-4edc-a92e-35db297082e2" containerName="horizon-log" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.282728 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8e743d2-607c-4443-ac08-bb87e44e6f00" containerName="ceilometer-central-agent" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.284389 4734 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.287125 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.287962 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.288193 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.308501 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.373254 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\") " pod="openstack/ceilometer-0" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.373357 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-config-data\") pod \"ceilometer-0\" (UID: \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\") " pod="openstack/ceilometer-0" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.373410 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-log-httpd\") pod \"ceilometer-0\" (UID: \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\") " pod="openstack/ceilometer-0" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.373426 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\") " pod="openstack/ceilometer-0" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.373463 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-scripts\") pod \"ceilometer-0\" (UID: \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\") " pod="openstack/ceilometer-0" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.373584 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqsnf\" (UniqueName: \"kubernetes.io/projected/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-kube-api-access-qqsnf\") pod \"ceilometer-0\" (UID: \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\") " pod="openstack/ceilometer-0" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.373659 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\") " pod="openstack/ceilometer-0" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.373738 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-run-httpd\") pod \"ceilometer-0\" (UID: \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\") " pod="openstack/ceilometer-0" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.475989 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\") " pod="openstack/ceilometer-0" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.476121 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-run-httpd\") pod \"ceilometer-0\" (UID: \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\") " pod="openstack/ceilometer-0" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.476181 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\") " pod="openstack/ceilometer-0" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.476203 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-config-data\") pod \"ceilometer-0\" (UID: \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\") " pod="openstack/ceilometer-0" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.476250 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-log-httpd\") pod \"ceilometer-0\" (UID: \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\") " pod="openstack/ceilometer-0" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.476270 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\") " pod="openstack/ceilometer-0" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.476320 4734 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-scripts\") pod \"ceilometer-0\" (UID: \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\") " pod="openstack/ceilometer-0" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.476354 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqsnf\" (UniqueName: \"kubernetes.io/projected/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-kube-api-access-qqsnf\") pod \"ceilometer-0\" (UID: \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\") " pod="openstack/ceilometer-0" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.477779 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-log-httpd\") pod \"ceilometer-0\" (UID: \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\") " pod="openstack/ceilometer-0" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.478061 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-run-httpd\") pod \"ceilometer-0\" (UID: \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\") " pod="openstack/ceilometer-0" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.483362 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\") " pod="openstack/ceilometer-0" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.484400 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\") " pod="openstack/ceilometer-0" Dec 05 23:39:53 crc 
kubenswrapper[4734]: I1205 23:39:53.484937 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\") " pod="openstack/ceilometer-0" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.493493 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-scripts\") pod \"ceilometer-0\" (UID: \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\") " pod="openstack/ceilometer-0" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.494137 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-config-data\") pod \"ceilometer-0\" (UID: \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\") " pod="openstack/ceilometer-0" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.507144 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqsnf\" (UniqueName: \"kubernetes.io/projected/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-kube-api-access-qqsnf\") pod \"ceilometer-0\" (UID: \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\") " pod="openstack/ceilometer-0" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.625436 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8e743d2-607c-4443-ac08-bb87e44e6f00" path="/var/lib/kubelet/pods/a8e743d2-607c-4443-ac08-bb87e44e6f00/volumes" Dec 05 23:39:53 crc kubenswrapper[4734]: I1205 23:39:53.642886 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 23:39:54 crc kubenswrapper[4734]: I1205 23:39:54.014199 4734 generic.go:334] "Generic (PLEG): container finished" podID="af968080-3e37-4034-90a7-0b654e68ee89" containerID="880b3909e293946d8333fcbf933f98a0d14d6594cac4b9529e4bbc96550c5578" exitCode=143 Dec 05 23:39:54 crc kubenswrapper[4734]: I1205 23:39:54.016069 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"af968080-3e37-4034-90a7-0b654e68ee89","Type":"ContainerDied","Data":"880b3909e293946d8333fcbf933f98a0d14d6594cac4b9529e4bbc96550c5578"} Dec 05 23:39:54 crc kubenswrapper[4734]: I1205 23:39:54.301187 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 23:39:54 crc kubenswrapper[4734]: I1205 23:39:54.349268 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 23:39:55 crc kubenswrapper[4734]: I1205 23:39:55.046304 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350","Type":"ContainerStarted","Data":"30302b4abf366dfba310850d0125358565c63a7dfc4569e6a155078856267070"} Dec 05 23:39:55 crc kubenswrapper[4734]: I1205 23:39:55.046807 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350","Type":"ContainerStarted","Data":"169d9790764ca94a7080c24a467cfd5e18b0a92f7159da00c2b7158dbcc6390b"} Dec 05 23:39:56 crc kubenswrapper[4734]: I1205 23:39:56.060625 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350","Type":"ContainerStarted","Data":"908819d566bd6792c4502a5e6fba71c14d9e6bfa7c5c91dba03ca94e9316d98f"} Dec 05 23:39:56 crc kubenswrapper[4734]: I1205 23:39:56.727494 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 23:39:56 
crc kubenswrapper[4734]: I1205 23:39:56.728371 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="05a4de5c-b10c-4d66-b3dd-0468357229b0" containerName="glance-log" containerID="cri-o://caa15c8be00b0cd026dbf85abad91cee75f3e43c982a4959a482b572e2817c84" gracePeriod=30 Dec 05 23:39:56 crc kubenswrapper[4734]: I1205 23:39:56.728643 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="05a4de5c-b10c-4d66-b3dd-0468357229b0" containerName="glance-httpd" containerID="cri-o://6fb7398bf31d93c15ed3a4e6788714ab58c7c73669d5c0a4145e85018ce10e05" gracePeriod=30 Dec 05 23:39:56 crc kubenswrapper[4734]: I1205 23:39:56.833275 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 23:39:56 crc kubenswrapper[4734]: I1205 23:39:56.959389 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd4fw\" (UniqueName: \"kubernetes.io/projected/af968080-3e37-4034-90a7-0b654e68ee89-kube-api-access-zd4fw\") pod \"af968080-3e37-4034-90a7-0b654e68ee89\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") " Dec 05 23:39:56 crc kubenswrapper[4734]: I1205 23:39:56.959492 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af968080-3e37-4034-90a7-0b654e68ee89-scripts\") pod \"af968080-3e37-4034-90a7-0b654e68ee89\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") " Dec 05 23:39:56 crc kubenswrapper[4734]: I1205 23:39:56.959561 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af968080-3e37-4034-90a7-0b654e68ee89-combined-ca-bundle\") pod \"af968080-3e37-4034-90a7-0b654e68ee89\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") " Dec 05 23:39:56 crc kubenswrapper[4734]: 
I1205 23:39:56.959589 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af968080-3e37-4034-90a7-0b654e68ee89-internal-tls-certs\") pod \"af968080-3e37-4034-90a7-0b654e68ee89\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") " Dec 05 23:39:56 crc kubenswrapper[4734]: I1205 23:39:56.959726 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/af968080-3e37-4034-90a7-0b654e68ee89-httpd-run\") pod \"af968080-3e37-4034-90a7-0b654e68ee89\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") " Dec 05 23:39:56 crc kubenswrapper[4734]: I1205 23:39:56.959750 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af968080-3e37-4034-90a7-0b654e68ee89-logs\") pod \"af968080-3e37-4034-90a7-0b654e68ee89\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") " Dec 05 23:39:56 crc kubenswrapper[4734]: I1205 23:39:56.959820 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af968080-3e37-4034-90a7-0b654e68ee89-config-data\") pod \"af968080-3e37-4034-90a7-0b654e68ee89\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") " Dec 05 23:39:56 crc kubenswrapper[4734]: I1205 23:39:56.959858 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"af968080-3e37-4034-90a7-0b654e68ee89\" (UID: \"af968080-3e37-4034-90a7-0b654e68ee89\") " Dec 05 23:39:56 crc kubenswrapper[4734]: I1205 23:39:56.962864 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af968080-3e37-4034-90a7-0b654e68ee89-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "af968080-3e37-4034-90a7-0b654e68ee89" (UID: "af968080-3e37-4034-90a7-0b654e68ee89"). 
InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:39:56 crc kubenswrapper[4734]: I1205 23:39:56.963149 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af968080-3e37-4034-90a7-0b654e68ee89-logs" (OuterVolumeSpecName: "logs") pod "af968080-3e37-4034-90a7-0b654e68ee89" (UID: "af968080-3e37-4034-90a7-0b654e68ee89"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:39:56 crc kubenswrapper[4734]: I1205 23:39:56.968411 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af968080-3e37-4034-90a7-0b654e68ee89-kube-api-access-zd4fw" (OuterVolumeSpecName: "kube-api-access-zd4fw") pod "af968080-3e37-4034-90a7-0b654e68ee89" (UID: "af968080-3e37-4034-90a7-0b654e68ee89"). InnerVolumeSpecName "kube-api-access-zd4fw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:39:56 crc kubenswrapper[4734]: I1205 23:39:56.973378 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "af968080-3e37-4034-90a7-0b654e68ee89" (UID: "af968080-3e37-4034-90a7-0b654e68ee89"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 23:39:56 crc kubenswrapper[4734]: I1205 23:39:56.974690 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af968080-3e37-4034-90a7-0b654e68ee89-scripts" (OuterVolumeSpecName: "scripts") pod "af968080-3e37-4034-90a7-0b654e68ee89" (UID: "af968080-3e37-4034-90a7-0b654e68ee89"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.007987 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af968080-3e37-4034-90a7-0b654e68ee89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af968080-3e37-4034-90a7-0b654e68ee89" (UID: "af968080-3e37-4034-90a7-0b654e68ee89"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.040716 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af968080-3e37-4034-90a7-0b654e68ee89-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "af968080-3e37-4034-90a7-0b654e68ee89" (UID: "af968080-3e37-4034-90a7-0b654e68ee89"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.067239 4734 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/af968080-3e37-4034-90a7-0b654e68ee89-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.067318 4734 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af968080-3e37-4034-90a7-0b654e68ee89-logs\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.067341 4734 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.067372 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd4fw\" (UniqueName: \"kubernetes.io/projected/af968080-3e37-4034-90a7-0b654e68ee89-kube-api-access-zd4fw\") on node \"crc\" DevicePath \"\"" Dec 05 
23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.067382 4734 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af968080-3e37-4034-90a7-0b654e68ee89-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.067393 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af968080-3e37-4034-90a7-0b654e68ee89-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.067402 4734 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af968080-3e37-4034-90a7-0b654e68ee89-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.070223 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af968080-3e37-4034-90a7-0b654e68ee89-config-data" (OuterVolumeSpecName: "config-data") pod "af968080-3e37-4034-90a7-0b654e68ee89" (UID: "af968080-3e37-4034-90a7-0b654e68ee89"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.076249 4734 generic.go:334] "Generic (PLEG): container finished" podID="05a4de5c-b10c-4d66-b3dd-0468357229b0" containerID="caa15c8be00b0cd026dbf85abad91cee75f3e43c982a4959a482b572e2817c84" exitCode=143 Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.076364 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"05a4de5c-b10c-4d66-b3dd-0468357229b0","Type":"ContainerDied","Data":"caa15c8be00b0cd026dbf85abad91cee75f3e43c982a4959a482b572e2817c84"} Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.081370 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350","Type":"ContainerStarted","Data":"d28f8c59e2b624118f759e364781e1afa8a6cdbbf4a691768c964f62c6637a01"} Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.083288 4734 generic.go:334] "Generic (PLEG): container finished" podID="af968080-3e37-4034-90a7-0b654e68ee89" containerID="1b47e509e05bda36df057f4e42fdaa560d8dbd474320378bd221a11489d58c65" exitCode=0 Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.083327 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"af968080-3e37-4034-90a7-0b654e68ee89","Type":"ContainerDied","Data":"1b47e509e05bda36df057f4e42fdaa560d8dbd474320378bd221a11489d58c65"} Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.083348 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"af968080-3e37-4034-90a7-0b654e68ee89","Type":"ContainerDied","Data":"2e3140b54dbe92c6d8dc229652ca7c5076899456ce76f964515283f40ffe415a"} Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.083366 4734 scope.go:117] "RemoveContainer" containerID="1b47e509e05bda36df057f4e42fdaa560d8dbd474320378bd221a11489d58c65" Dec 05 23:39:57 
crc kubenswrapper[4734]: I1205 23:39:57.083436 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.092327 4734 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.122205 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.123385 4734 scope.go:117] "RemoveContainer" containerID="880b3909e293946d8333fcbf933f98a0d14d6594cac4b9529e4bbc96550c5578" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.135920 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.153711 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.163453 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 23:39:57 crc kubenswrapper[4734]: E1205 23:39:57.163947 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af968080-3e37-4034-90a7-0b654e68ee89" containerName="glance-httpd" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.163967 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="af968080-3e37-4034-90a7-0b654e68ee89" containerName="glance-httpd" Dec 05 23:39:57 crc kubenswrapper[4734]: E1205 23:39:57.163983 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af968080-3e37-4034-90a7-0b654e68ee89" containerName="glance-log" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.163990 4734 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="af968080-3e37-4034-90a7-0b654e68ee89" containerName="glance-log" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.164195 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="af968080-3e37-4034-90a7-0b654e68ee89" containerName="glance-httpd" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.164223 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="af968080-3e37-4034-90a7-0b654e68ee89" containerName="glance-log" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.165279 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.167899 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.168335 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.169748 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af968080-3e37-4034-90a7-0b654e68ee89-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.169768 4734 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.173954 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.174169 4734 scope.go:117] "RemoveContainer" containerID="1b47e509e05bda36df057f4e42fdaa560d8dbd474320378bd221a11489d58c65" Dec 05 23:39:57 crc kubenswrapper[4734]: E1205 23:39:57.219707 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"1b47e509e05bda36df057f4e42fdaa560d8dbd474320378bd221a11489d58c65\": container with ID starting with 1b47e509e05bda36df057f4e42fdaa560d8dbd474320378bd221a11489d58c65 not found: ID does not exist" containerID="1b47e509e05bda36df057f4e42fdaa560d8dbd474320378bd221a11489d58c65" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.219777 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b47e509e05bda36df057f4e42fdaa560d8dbd474320378bd221a11489d58c65"} err="failed to get container status \"1b47e509e05bda36df057f4e42fdaa560d8dbd474320378bd221a11489d58c65\": rpc error: code = NotFound desc = could not find container \"1b47e509e05bda36df057f4e42fdaa560d8dbd474320378bd221a11489d58c65\": container with ID starting with 1b47e509e05bda36df057f4e42fdaa560d8dbd474320378bd221a11489d58c65 not found: ID does not exist" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.219815 4734 scope.go:117] "RemoveContainer" containerID="880b3909e293946d8333fcbf933f98a0d14d6594cac4b9529e4bbc96550c5578" Dec 05 23:39:57 crc kubenswrapper[4734]: E1205 23:39:57.222073 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"880b3909e293946d8333fcbf933f98a0d14d6594cac4b9529e4bbc96550c5578\": container with ID starting with 880b3909e293946d8333fcbf933f98a0d14d6594cac4b9529e4bbc96550c5578 not found: ID does not exist" containerID="880b3909e293946d8333fcbf933f98a0d14d6594cac4b9529e4bbc96550c5578" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.222108 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"880b3909e293946d8333fcbf933f98a0d14d6594cac4b9529e4bbc96550c5578"} err="failed to get container status \"880b3909e293946d8333fcbf933f98a0d14d6594cac4b9529e4bbc96550c5578\": rpc error: code = NotFound desc = could not find container 
\"880b3909e293946d8333fcbf933f98a0d14d6594cac4b9529e4bbc96550c5578\": container with ID starting with 880b3909e293946d8333fcbf933f98a0d14d6594cac4b9529e4bbc96550c5578 not found: ID does not exist" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.271645 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sqxp\" (UniqueName: \"kubernetes.io/projected/d6b10458-86e7-4568-b3b5-2a3e090b90a8-kube-api-access-5sqxp\") pod \"glance-default-internal-api-0\" (UID: \"d6b10458-86e7-4568-b3b5-2a3e090b90a8\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.272150 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6b10458-86e7-4568-b3b5-2a3e090b90a8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d6b10458-86e7-4568-b3b5-2a3e090b90a8\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.272204 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6b10458-86e7-4568-b3b5-2a3e090b90a8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d6b10458-86e7-4568-b3b5-2a3e090b90a8\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.272231 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d6b10458-86e7-4568-b3b5-2a3e090b90a8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d6b10458-86e7-4568-b3b5-2a3e090b90a8\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.272272 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6b10458-86e7-4568-b3b5-2a3e090b90a8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d6b10458-86e7-4568-b3b5-2a3e090b90a8\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.272836 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6b10458-86e7-4568-b3b5-2a3e090b90a8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d6b10458-86e7-4568-b3b5-2a3e090b90a8\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.273012 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"d6b10458-86e7-4568-b3b5-2a3e090b90a8\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.273146 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6b10458-86e7-4568-b3b5-2a3e090b90a8-logs\") pod \"glance-default-internal-api-0\" (UID: \"d6b10458-86e7-4568-b3b5-2a3e090b90a8\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.375806 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6b10458-86e7-4568-b3b5-2a3e090b90a8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d6b10458-86e7-4568-b3b5-2a3e090b90a8\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.375868 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/d6b10458-86e7-4568-b3b5-2a3e090b90a8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d6b10458-86e7-4568-b3b5-2a3e090b90a8\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.375905 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6b10458-86e7-4568-b3b5-2a3e090b90a8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d6b10458-86e7-4568-b3b5-2a3e090b90a8\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.375936 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6b10458-86e7-4568-b3b5-2a3e090b90a8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d6b10458-86e7-4568-b3b5-2a3e090b90a8\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.375981 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"d6b10458-86e7-4568-b3b5-2a3e090b90a8\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.376015 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6b10458-86e7-4568-b3b5-2a3e090b90a8-logs\") pod \"glance-default-internal-api-0\" (UID: \"d6b10458-86e7-4568-b3b5-2a3e090b90a8\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.376068 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sqxp\" (UniqueName: \"kubernetes.io/projected/d6b10458-86e7-4568-b3b5-2a3e090b90a8-kube-api-access-5sqxp\") pod 
\"glance-default-internal-api-0\" (UID: \"d6b10458-86e7-4568-b3b5-2a3e090b90a8\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.376099 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6b10458-86e7-4568-b3b5-2a3e090b90a8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d6b10458-86e7-4568-b3b5-2a3e090b90a8\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.378501 4734 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"d6b10458-86e7-4568-b3b5-2a3e090b90a8\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.381624 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6b10458-86e7-4568-b3b5-2a3e090b90a8-logs\") pod \"glance-default-internal-api-0\" (UID: \"d6b10458-86e7-4568-b3b5-2a3e090b90a8\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.378508 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d6b10458-86e7-4568-b3b5-2a3e090b90a8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d6b10458-86e7-4568-b3b5-2a3e090b90a8\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.386239 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6b10458-86e7-4568-b3b5-2a3e090b90a8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"d6b10458-86e7-4568-b3b5-2a3e090b90a8\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.389205 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6b10458-86e7-4568-b3b5-2a3e090b90a8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d6b10458-86e7-4568-b3b5-2a3e090b90a8\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.411936 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6b10458-86e7-4568-b3b5-2a3e090b90a8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d6b10458-86e7-4568-b3b5-2a3e090b90a8\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.418297 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6b10458-86e7-4568-b3b5-2a3e090b90a8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d6b10458-86e7-4568-b3b5-2a3e090b90a8\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.426376 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sqxp\" (UniqueName: \"kubernetes.io/projected/d6b10458-86e7-4568-b3b5-2a3e090b90a8-kube-api-access-5sqxp\") pod \"glance-default-internal-api-0\" (UID: \"d6b10458-86e7-4568-b3b5-2a3e090b90a8\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.519231 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"d6b10458-86e7-4568-b3b5-2a3e090b90a8\") " pod="openstack/glance-default-internal-api-0" Dec 05 23:39:57 crc 
kubenswrapper[4734]: I1205 23:39:57.632292 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af968080-3e37-4034-90a7-0b654e68ee89" path="/var/lib/kubelet/pods/af968080-3e37-4034-90a7-0b654e68ee89/volumes" Dec 05 23:39:57 crc kubenswrapper[4734]: I1205 23:39:57.809431 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 23:39:58 crc kubenswrapper[4734]: I1205 23:39:58.104339 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350","Type":"ContainerStarted","Data":"eee0ce98affad252808be83570075bdf43423788a61427e6d7b3f43a9e5f797f"} Dec 05 23:39:58 crc kubenswrapper[4734]: I1205 23:39:58.104744 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350" containerName="ceilometer-central-agent" containerID="cri-o://30302b4abf366dfba310850d0125358565c63a7dfc4569e6a155078856267070" gracePeriod=30 Dec 05 23:39:58 crc kubenswrapper[4734]: I1205 23:39:58.105602 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 23:39:58 crc kubenswrapper[4734]: I1205 23:39:58.105697 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350" containerName="proxy-httpd" containerID="cri-o://eee0ce98affad252808be83570075bdf43423788a61427e6d7b3f43a9e5f797f" gracePeriod=30 Dec 05 23:39:58 crc kubenswrapper[4734]: I1205 23:39:58.105872 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350" containerName="ceilometer-notification-agent" containerID="cri-o://908819d566bd6792c4502a5e6fba71c14d9e6bfa7c5c91dba03ca94e9316d98f" gracePeriod=30 Dec 05 23:39:58 crc kubenswrapper[4734]: I1205 23:39:58.105921 
4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350" containerName="sg-core" containerID="cri-o://d28f8c59e2b624118f759e364781e1afa8a6cdbbf4a691768c964f62c6637a01" gracePeriod=30 Dec 05 23:39:58 crc kubenswrapper[4734]: I1205 23:39:58.126630 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.6826911020000002 podStartE2EDuration="5.126608598s" podCreationTimestamp="2025-12-05 23:39:53 +0000 UTC" firstStartedPulling="2025-12-05 23:39:54.3021227 +0000 UTC m=+1214.985526976" lastFinishedPulling="2025-12-05 23:39:57.746040196 +0000 UTC m=+1218.429444472" observedRunningTime="2025-12-05 23:39:58.126383253 +0000 UTC m=+1218.809787529" watchObservedRunningTime="2025-12-05 23:39:58.126608598 +0000 UTC m=+1218.810012874" Dec 05 23:39:58 crc kubenswrapper[4734]: I1205 23:39:58.498111 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 23:39:58 crc kubenswrapper[4734]: W1205 23:39:58.508098 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6b10458_86e7_4568_b3b5_2a3e090b90a8.slice/crio-ed732ae25a46741f2eddb0a8fc917968be7852c246f323a257fb353f6dd40acd WatchSource:0}: Error finding container ed732ae25a46741f2eddb0a8fc917968be7852c246f323a257fb353f6dd40acd: Status 404 returned error can't find the container with id ed732ae25a46741f2eddb0a8fc917968be7852c246f323a257fb353f6dd40acd Dec 05 23:39:59 crc kubenswrapper[4734]: I1205 23:39:59.126205 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d6b10458-86e7-4568-b3b5-2a3e090b90a8","Type":"ContainerStarted","Data":"ed732ae25a46741f2eddb0a8fc917968be7852c246f323a257fb353f6dd40acd"} Dec 05 23:39:59 crc kubenswrapper[4734]: I1205 23:39:59.131411 4734 generic.go:334] 
"Generic (PLEG): container finished" podID="f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350" containerID="d28f8c59e2b624118f759e364781e1afa8a6cdbbf4a691768c964f62c6637a01" exitCode=2 Dec 05 23:39:59 crc kubenswrapper[4734]: I1205 23:39:59.135742 4734 generic.go:334] "Generic (PLEG): container finished" podID="f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350" containerID="908819d566bd6792c4502a5e6fba71c14d9e6bfa7c5c91dba03ca94e9316d98f" exitCode=0 Dec 05 23:39:59 crc kubenswrapper[4734]: I1205 23:39:59.131634 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350","Type":"ContainerDied","Data":"d28f8c59e2b624118f759e364781e1afa8a6cdbbf4a691768c964f62c6637a01"} Dec 05 23:39:59 crc kubenswrapper[4734]: I1205 23:39:59.135847 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350","Type":"ContainerDied","Data":"908819d566bd6792c4502a5e6fba71c14d9e6bfa7c5c91dba03ca94e9316d98f"} Dec 05 23:40:00 crc kubenswrapper[4734]: I1205 23:40:00.147960 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d6b10458-86e7-4568-b3b5-2a3e090b90a8","Type":"ContainerStarted","Data":"b669dcf87c11932fdd04688d70f5e5d653e5af82004687eb2261812fc97292a6"} Dec 05 23:40:00 crc kubenswrapper[4734]: I1205 23:40:00.148971 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d6b10458-86e7-4568-b3b5-2a3e090b90a8","Type":"ContainerStarted","Data":"518df8a1fe9b2262bcf324ba0f0a2c24d4a7fdda4f000bca333e782a44914daa"} Dec 05 23:40:00 crc kubenswrapper[4734]: I1205 23:40:00.187373 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.187345195 podStartE2EDuration="3.187345195s" podCreationTimestamp="2025-12-05 23:39:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:40:00.16936379 +0000 UTC m=+1220.852768066" watchObservedRunningTime="2025-12-05 23:40:00.187345195 +0000 UTC m=+1220.870749471" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.110763 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.173825 4734 generic.go:334] "Generic (PLEG): container finished" podID="05a4de5c-b10c-4d66-b3dd-0468357229b0" containerID="6fb7398bf31d93c15ed3a4e6788714ab58c7c73669d5c0a4145e85018ce10e05" exitCode=0 Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.175233 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"05a4de5c-b10c-4d66-b3dd-0468357229b0","Type":"ContainerDied","Data":"6fb7398bf31d93c15ed3a4e6788714ab58c7c73669d5c0a4145e85018ce10e05"} Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.175339 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"05a4de5c-b10c-4d66-b3dd-0468357229b0","Type":"ContainerDied","Data":"3a448d7b8af7253e02b0d41664aa248ec5d29f3d1038d90d6a30c0cdbe378b80"} Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.175365 4734 scope.go:117] "RemoveContainer" containerID="6fb7398bf31d93c15ed3a4e6788714ab58c7c73669d5c0a4145e85018ce10e05" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.175282 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.199536 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05a4de5c-b10c-4d66-b3dd-0468357229b0-config-data\") pod \"05a4de5c-b10c-4d66-b3dd-0468357229b0\" (UID: \"05a4de5c-b10c-4d66-b3dd-0468357229b0\") " Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.199608 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05a4de5c-b10c-4d66-b3dd-0468357229b0-scripts\") pod \"05a4de5c-b10c-4d66-b3dd-0468357229b0\" (UID: \"05a4de5c-b10c-4d66-b3dd-0468357229b0\") " Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.199714 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05a4de5c-b10c-4d66-b3dd-0468357229b0-public-tls-certs\") pod \"05a4de5c-b10c-4d66-b3dd-0468357229b0\" (UID: \"05a4de5c-b10c-4d66-b3dd-0468357229b0\") " Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.199805 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"05a4de5c-b10c-4d66-b3dd-0468357229b0\" (UID: \"05a4de5c-b10c-4d66-b3dd-0468357229b0\") " Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.199882 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05a4de5c-b10c-4d66-b3dd-0468357229b0-httpd-run\") pod \"05a4de5c-b10c-4d66-b3dd-0468357229b0\" (UID: \"05a4de5c-b10c-4d66-b3dd-0468357229b0\") " Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.199979 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/05a4de5c-b10c-4d66-b3dd-0468357229b0-logs\") pod \"05a4de5c-b10c-4d66-b3dd-0468357229b0\" (UID: \"05a4de5c-b10c-4d66-b3dd-0468357229b0\") " Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.200057 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bq2zb\" (UniqueName: \"kubernetes.io/projected/05a4de5c-b10c-4d66-b3dd-0468357229b0-kube-api-access-bq2zb\") pod \"05a4de5c-b10c-4d66-b3dd-0468357229b0\" (UID: \"05a4de5c-b10c-4d66-b3dd-0468357229b0\") " Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.200102 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a4de5c-b10c-4d66-b3dd-0468357229b0-combined-ca-bundle\") pod \"05a4de5c-b10c-4d66-b3dd-0468357229b0\" (UID: \"05a4de5c-b10c-4d66-b3dd-0468357229b0\") " Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.204335 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05a4de5c-b10c-4d66-b3dd-0468357229b0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "05a4de5c-b10c-4d66-b3dd-0468357229b0" (UID: "05a4de5c-b10c-4d66-b3dd-0468357229b0"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.204734 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05a4de5c-b10c-4d66-b3dd-0468357229b0-logs" (OuterVolumeSpecName: "logs") pod "05a4de5c-b10c-4d66-b3dd-0468357229b0" (UID: "05a4de5c-b10c-4d66-b3dd-0468357229b0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.210775 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05a4de5c-b10c-4d66-b3dd-0468357229b0-scripts" (OuterVolumeSpecName: "scripts") pod "05a4de5c-b10c-4d66-b3dd-0468357229b0" (UID: "05a4de5c-b10c-4d66-b3dd-0468357229b0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.217654 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "05a4de5c-b10c-4d66-b3dd-0468357229b0" (UID: "05a4de5c-b10c-4d66-b3dd-0468357229b0"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.217691 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05a4de5c-b10c-4d66-b3dd-0468357229b0-kube-api-access-bq2zb" (OuterVolumeSpecName: "kube-api-access-bq2zb") pod "05a4de5c-b10c-4d66-b3dd-0468357229b0" (UID: "05a4de5c-b10c-4d66-b3dd-0468357229b0"). InnerVolumeSpecName "kube-api-access-bq2zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.234138 4734 scope.go:117] "RemoveContainer" containerID="caa15c8be00b0cd026dbf85abad91cee75f3e43c982a4959a482b572e2817c84" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.274992 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05a4de5c-b10c-4d66-b3dd-0468357229b0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "05a4de5c-b10c-4d66-b3dd-0468357229b0" (UID: "05a4de5c-b10c-4d66-b3dd-0468357229b0"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.279739 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05a4de5c-b10c-4d66-b3dd-0468357229b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05a4de5c-b10c-4d66-b3dd-0468357229b0" (UID: "05a4de5c-b10c-4d66-b3dd-0468357229b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.302997 4734 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05a4de5c-b10c-4d66-b3dd-0468357229b0-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.303039 4734 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05a4de5c-b10c-4d66-b3dd-0468357229b0-logs\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.303051 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bq2zb\" (UniqueName: \"kubernetes.io/projected/05a4de5c-b10c-4d66-b3dd-0468357229b0-kube-api-access-bq2zb\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.303063 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a4de5c-b10c-4d66-b3dd-0468357229b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.303074 4734 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05a4de5c-b10c-4d66-b3dd-0468357229b0-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.303084 4734 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/05a4de5c-b10c-4d66-b3dd-0468357229b0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.303298 4734 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.326619 4734 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.331633 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05a4de5c-b10c-4d66-b3dd-0468357229b0-config-data" (OuterVolumeSpecName: "config-data") pod "05a4de5c-b10c-4d66-b3dd-0468357229b0" (UID: "05a4de5c-b10c-4d66-b3dd-0468357229b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.368824 4734 scope.go:117] "RemoveContainer" containerID="6fb7398bf31d93c15ed3a4e6788714ab58c7c73669d5c0a4145e85018ce10e05" Dec 05 23:40:01 crc kubenswrapper[4734]: E1205 23:40:01.369551 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fb7398bf31d93c15ed3a4e6788714ab58c7c73669d5c0a4145e85018ce10e05\": container with ID starting with 6fb7398bf31d93c15ed3a4e6788714ab58c7c73669d5c0a4145e85018ce10e05 not found: ID does not exist" containerID="6fb7398bf31d93c15ed3a4e6788714ab58c7c73669d5c0a4145e85018ce10e05" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.369615 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fb7398bf31d93c15ed3a4e6788714ab58c7c73669d5c0a4145e85018ce10e05"} err="failed to get container status 
\"6fb7398bf31d93c15ed3a4e6788714ab58c7c73669d5c0a4145e85018ce10e05\": rpc error: code = NotFound desc = could not find container \"6fb7398bf31d93c15ed3a4e6788714ab58c7c73669d5c0a4145e85018ce10e05\": container with ID starting with 6fb7398bf31d93c15ed3a4e6788714ab58c7c73669d5c0a4145e85018ce10e05 not found: ID does not exist" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.369650 4734 scope.go:117] "RemoveContainer" containerID="caa15c8be00b0cd026dbf85abad91cee75f3e43c982a4959a482b572e2817c84" Dec 05 23:40:01 crc kubenswrapper[4734]: E1205 23:40:01.370269 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caa15c8be00b0cd026dbf85abad91cee75f3e43c982a4959a482b572e2817c84\": container with ID starting with caa15c8be00b0cd026dbf85abad91cee75f3e43c982a4959a482b572e2817c84 not found: ID does not exist" containerID="caa15c8be00b0cd026dbf85abad91cee75f3e43c982a4959a482b572e2817c84" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.370331 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caa15c8be00b0cd026dbf85abad91cee75f3e43c982a4959a482b572e2817c84"} err="failed to get container status \"caa15c8be00b0cd026dbf85abad91cee75f3e43c982a4959a482b572e2817c84\": rpc error: code = NotFound desc = could not find container \"caa15c8be00b0cd026dbf85abad91cee75f3e43c982a4959a482b572e2817c84\": container with ID starting with caa15c8be00b0cd026dbf85abad91cee75f3e43c982a4959a482b572e2817c84 not found: ID does not exist" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.405892 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05a4de5c-b10c-4d66-b3dd-0468357229b0-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.405943 4734 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.516826 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.527559 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.550853 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 23:40:01 crc kubenswrapper[4734]: E1205 23:40:01.551426 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a4de5c-b10c-4d66-b3dd-0468357229b0" containerName="glance-log" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.551457 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a4de5c-b10c-4d66-b3dd-0468357229b0" containerName="glance-log" Dec 05 23:40:01 crc kubenswrapper[4734]: E1205 23:40:01.551492 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a4de5c-b10c-4d66-b3dd-0468357229b0" containerName="glance-httpd" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.551501 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a4de5c-b10c-4d66-b3dd-0468357229b0" containerName="glance-httpd" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.551810 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="05a4de5c-b10c-4d66-b3dd-0468357229b0" containerName="glance-httpd" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.551852 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="05a4de5c-b10c-4d66-b3dd-0468357229b0" containerName="glance-log" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.553148 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.560023 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.560109 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.572459 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.629167 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05a4de5c-b10c-4d66-b3dd-0468357229b0" path="/var/lib/kubelet/pods/05a4de5c-b10c-4d66-b3dd-0468357229b0/volumes" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.713336 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd48s\" (UniqueName: \"kubernetes.io/projected/eddf1584-198a-4279-a09a-30500f1842f3-kube-api-access-kd48s\") pod \"glance-default-external-api-0\" (UID: \"eddf1584-198a-4279-a09a-30500f1842f3\") " pod="openstack/glance-default-external-api-0" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.713432 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eddf1584-198a-4279-a09a-30500f1842f3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"eddf1584-198a-4279-a09a-30500f1842f3\") " pod="openstack/glance-default-external-api-0" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.713686 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eddf1584-198a-4279-a09a-30500f1842f3-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"eddf1584-198a-4279-a09a-30500f1842f3\") " pod="openstack/glance-default-external-api-0" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.713789 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eddf1584-198a-4279-a09a-30500f1842f3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"eddf1584-198a-4279-a09a-30500f1842f3\") " pod="openstack/glance-default-external-api-0" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.713972 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"eddf1584-198a-4279-a09a-30500f1842f3\") " pod="openstack/glance-default-external-api-0" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.714119 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eddf1584-198a-4279-a09a-30500f1842f3-logs\") pod \"glance-default-external-api-0\" (UID: \"eddf1584-198a-4279-a09a-30500f1842f3\") " pod="openstack/glance-default-external-api-0" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.714192 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eddf1584-198a-4279-a09a-30500f1842f3-config-data\") pod \"glance-default-external-api-0\" (UID: \"eddf1584-198a-4279-a09a-30500f1842f3\") " pod="openstack/glance-default-external-api-0" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.714432 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eddf1584-198a-4279-a09a-30500f1842f3-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"eddf1584-198a-4279-a09a-30500f1842f3\") " pod="openstack/glance-default-external-api-0" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.816893 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eddf1584-198a-4279-a09a-30500f1842f3-logs\") pod \"glance-default-external-api-0\" (UID: \"eddf1584-198a-4279-a09a-30500f1842f3\") " pod="openstack/glance-default-external-api-0" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.817014 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eddf1584-198a-4279-a09a-30500f1842f3-config-data\") pod \"glance-default-external-api-0\" (UID: \"eddf1584-198a-4279-a09a-30500f1842f3\") " pod="openstack/glance-default-external-api-0" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.817589 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eddf1584-198a-4279-a09a-30500f1842f3-logs\") pod \"glance-default-external-api-0\" (UID: \"eddf1584-198a-4279-a09a-30500f1842f3\") " pod="openstack/glance-default-external-api-0" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.818099 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eddf1584-198a-4279-a09a-30500f1842f3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"eddf1584-198a-4279-a09a-30500f1842f3\") " pod="openstack/glance-default-external-api-0" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.818178 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd48s\" (UniqueName: \"kubernetes.io/projected/eddf1584-198a-4279-a09a-30500f1842f3-kube-api-access-kd48s\") pod \"glance-default-external-api-0\" (UID: \"eddf1584-198a-4279-a09a-30500f1842f3\") " pod="openstack/glance-default-external-api-0" Dec 05 
23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.818321 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eddf1584-198a-4279-a09a-30500f1842f3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"eddf1584-198a-4279-a09a-30500f1842f3\") " pod="openstack/glance-default-external-api-0" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.818428 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eddf1584-198a-4279-a09a-30500f1842f3-scripts\") pod \"glance-default-external-api-0\" (UID: \"eddf1584-198a-4279-a09a-30500f1842f3\") " pod="openstack/glance-default-external-api-0" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.818448 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eddf1584-198a-4279-a09a-30500f1842f3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"eddf1584-198a-4279-a09a-30500f1842f3\") " pod="openstack/glance-default-external-api-0" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.818483 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eddf1584-198a-4279-a09a-30500f1842f3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"eddf1584-198a-4279-a09a-30500f1842f3\") " pod="openstack/glance-default-external-api-0" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.818611 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"eddf1584-198a-4279-a09a-30500f1842f3\") " pod="openstack/glance-default-external-api-0" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.818837 4734 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"eddf1584-198a-4279-a09a-30500f1842f3\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.827397 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eddf1584-198a-4279-a09a-30500f1842f3-scripts\") pod \"glance-default-external-api-0\" (UID: \"eddf1584-198a-4279-a09a-30500f1842f3\") " pod="openstack/glance-default-external-api-0" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.827444 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eddf1584-198a-4279-a09a-30500f1842f3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"eddf1584-198a-4279-a09a-30500f1842f3\") " pod="openstack/glance-default-external-api-0" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.827499 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eddf1584-198a-4279-a09a-30500f1842f3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"eddf1584-198a-4279-a09a-30500f1842f3\") " pod="openstack/glance-default-external-api-0" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.828009 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eddf1584-198a-4279-a09a-30500f1842f3-config-data\") pod \"glance-default-external-api-0\" (UID: \"eddf1584-198a-4279-a09a-30500f1842f3\") " pod="openstack/glance-default-external-api-0" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.842951 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd48s\" (UniqueName: 
\"kubernetes.io/projected/eddf1584-198a-4279-a09a-30500f1842f3-kube-api-access-kd48s\") pod \"glance-default-external-api-0\" (UID: \"eddf1584-198a-4279-a09a-30500f1842f3\") " pod="openstack/glance-default-external-api-0" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.858210 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"eddf1584-198a-4279-a09a-30500f1842f3\") " pod="openstack/glance-default-external-api-0" Dec 05 23:40:01 crc kubenswrapper[4734]: I1205 23:40:01.886824 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 23:40:02 crc kubenswrapper[4734]: I1205 23:40:02.511711 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 23:40:03 crc kubenswrapper[4734]: I1205 23:40:03.210729 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eddf1584-198a-4279-a09a-30500f1842f3","Type":"ContainerStarted","Data":"f7902a0971bfc2ba7260b52f544a7757e2107782a6a54aa2e63e94ac17142644"} Dec 05 23:40:04 crc kubenswrapper[4734]: I1205 23:40:04.230990 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eddf1584-198a-4279-a09a-30500f1842f3","Type":"ContainerStarted","Data":"f123cb70a522f8dbc80240ea481cc015d89afbc7514d26015ab7d36b3d76a0e6"} Dec 05 23:40:04 crc kubenswrapper[4734]: I1205 23:40:04.231986 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eddf1584-198a-4279-a09a-30500f1842f3","Type":"ContainerStarted","Data":"5ec0577ef5c90c200609caf1fa54666ea30c069b4000446d83b6826564b32a2b"} Dec 05 23:40:04 crc kubenswrapper[4734]: I1205 23:40:04.270649 4734 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.270625833 podStartE2EDuration="3.270625833s" podCreationTimestamp="2025-12-05 23:40:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:40:04.260779905 +0000 UTC m=+1224.944184181" watchObservedRunningTime="2025-12-05 23:40:04.270625833 +0000 UTC m=+1224.954030109" Dec 05 23:40:07 crc kubenswrapper[4734]: I1205 23:40:07.809873 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 23:40:07 crc kubenswrapper[4734]: I1205 23:40:07.810878 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 23:40:07 crc kubenswrapper[4734]: I1205 23:40:07.848290 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 23:40:07 crc kubenswrapper[4734]: I1205 23:40:07.861806 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.141616 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-4fmf5"] Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.143149 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-4fmf5" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.198042 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-4fmf5"] Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.281799 4734 generic.go:334] "Generic (PLEG): container finished" podID="f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350" containerID="30302b4abf366dfba310850d0125358565c63a7dfc4569e6a155078856267070" exitCode=0 Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.282084 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350","Type":"ContainerDied","Data":"30302b4abf366dfba310850d0125358565c63a7dfc4569e6a155078856267070"} Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.283731 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.284643 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.285433 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b88ba6a0-1e12-4bd6-a483-b2522fad58f9-operator-scripts\") pod \"nova-api-db-create-4fmf5\" (UID: \"b88ba6a0-1e12-4bd6-a483-b2522fad58f9\") " pod="openstack/nova-api-db-create-4fmf5" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.285506 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm2jq\" (UniqueName: \"kubernetes.io/projected/b88ba6a0-1e12-4bd6-a483-b2522fad58f9-kube-api-access-lm2jq\") pod \"nova-api-db-create-4fmf5\" (UID: \"b88ba6a0-1e12-4bd6-a483-b2522fad58f9\") " pod="openstack/nova-api-db-create-4fmf5" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 
23:40:08.299622 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-qrbf6"] Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.301600 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qrbf6" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.306809 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qrbf6"] Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.375343 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-jdhd2"] Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.376932 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-jdhd2" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.388956 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8v7w\" (UniqueName: \"kubernetes.io/projected/caa5035d-868a-4c3a-bb3e-43f7b84096f4-kube-api-access-q8v7w\") pod \"nova-cell0-db-create-qrbf6\" (UID: \"caa5035d-868a-4c3a-bb3e-43f7b84096f4\") " pod="openstack/nova-cell0-db-create-qrbf6" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.389192 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b88ba6a0-1e12-4bd6-a483-b2522fad58f9-operator-scripts\") pod \"nova-api-db-create-4fmf5\" (UID: \"b88ba6a0-1e12-4bd6-a483-b2522fad58f9\") " pod="openstack/nova-api-db-create-4fmf5" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.389275 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/caa5035d-868a-4c3a-bb3e-43f7b84096f4-operator-scripts\") pod \"nova-cell0-db-create-qrbf6\" (UID: \"caa5035d-868a-4c3a-bb3e-43f7b84096f4\") " 
pod="openstack/nova-cell0-db-create-qrbf6" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.389323 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm2jq\" (UniqueName: \"kubernetes.io/projected/b88ba6a0-1e12-4bd6-a483-b2522fad58f9-kube-api-access-lm2jq\") pod \"nova-api-db-create-4fmf5\" (UID: \"b88ba6a0-1e12-4bd6-a483-b2522fad58f9\") " pod="openstack/nova-api-db-create-4fmf5" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.390343 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b88ba6a0-1e12-4bd6-a483-b2522fad58f9-operator-scripts\") pod \"nova-api-db-create-4fmf5\" (UID: \"b88ba6a0-1e12-4bd6-a483-b2522fad58f9\") " pod="openstack/nova-api-db-create-4fmf5" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.393774 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-3f13-account-create-update-p4zg7"] Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.398027 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-3f13-account-create-update-p4zg7" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.403851 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.408994 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-jdhd2"] Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.432782 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-3f13-account-create-update-p4zg7"] Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.435446 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm2jq\" (UniqueName: \"kubernetes.io/projected/b88ba6a0-1e12-4bd6-a483-b2522fad58f9-kube-api-access-lm2jq\") pod \"nova-api-db-create-4fmf5\" (UID: \"b88ba6a0-1e12-4bd6-a483-b2522fad58f9\") " pod="openstack/nova-api-db-create-4fmf5" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.492975 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2bc9278-e755-43b5-8fb5-91854e437360-operator-scripts\") pod \"nova-api-3f13-account-create-update-p4zg7\" (UID: \"b2bc9278-e755-43b5-8fb5-91854e437360\") " pod="openstack/nova-api-3f13-account-create-update-p4zg7" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.493049 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/caa5035d-868a-4c3a-bb3e-43f7b84096f4-operator-scripts\") pod \"nova-cell0-db-create-qrbf6\" (UID: \"caa5035d-868a-4c3a-bb3e-43f7b84096f4\") " pod="openstack/nova-cell0-db-create-qrbf6" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.493150 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8v7w\" (UniqueName: 
\"kubernetes.io/projected/caa5035d-868a-4c3a-bb3e-43f7b84096f4-kube-api-access-q8v7w\") pod \"nova-cell0-db-create-qrbf6\" (UID: \"caa5035d-868a-4c3a-bb3e-43f7b84096f4\") " pod="openstack/nova-cell0-db-create-qrbf6" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.493174 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0-operator-scripts\") pod \"nova-cell1-db-create-jdhd2\" (UID: \"e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0\") " pod="openstack/nova-cell1-db-create-jdhd2" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.493202 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4w2k\" (UniqueName: \"kubernetes.io/projected/e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0-kube-api-access-p4w2k\") pod \"nova-cell1-db-create-jdhd2\" (UID: \"e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0\") " pod="openstack/nova-cell1-db-create-jdhd2" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.493236 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9blpj\" (UniqueName: \"kubernetes.io/projected/b2bc9278-e755-43b5-8fb5-91854e437360-kube-api-access-9blpj\") pod \"nova-api-3f13-account-create-update-p4zg7\" (UID: \"b2bc9278-e755-43b5-8fb5-91854e437360\") " pod="openstack/nova-api-3f13-account-create-update-p4zg7" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.494062 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/caa5035d-868a-4c3a-bb3e-43f7b84096f4-operator-scripts\") pod \"nova-cell0-db-create-qrbf6\" (UID: \"caa5035d-868a-4c3a-bb3e-43f7b84096f4\") " pod="openstack/nova-cell0-db-create-qrbf6" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.494582 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-4fmf5" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.532086 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8v7w\" (UniqueName: \"kubernetes.io/projected/caa5035d-868a-4c3a-bb3e-43f7b84096f4-kube-api-access-q8v7w\") pod \"nova-cell0-db-create-qrbf6\" (UID: \"caa5035d-868a-4c3a-bb3e-43f7b84096f4\") " pod="openstack/nova-cell0-db-create-qrbf6" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.557715 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-d01a-account-create-update-4smb4"] Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.560052 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d01a-account-create-update-4smb4" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.565089 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.596709 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2bc9278-e755-43b5-8fb5-91854e437360-operator-scripts\") pod \"nova-api-3f13-account-create-update-p4zg7\" (UID: \"b2bc9278-e755-43b5-8fb5-91854e437360\") " pod="openstack/nova-api-3f13-account-create-update-p4zg7" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.596840 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0-operator-scripts\") pod \"nova-cell1-db-create-jdhd2\" (UID: \"e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0\") " pod="openstack/nova-cell1-db-create-jdhd2" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.596862 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4w2k\" (UniqueName: 
\"kubernetes.io/projected/e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0-kube-api-access-p4w2k\") pod \"nova-cell1-db-create-jdhd2\" (UID: \"e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0\") " pod="openstack/nova-cell1-db-create-jdhd2" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.596910 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9blpj\" (UniqueName: \"kubernetes.io/projected/b2bc9278-e755-43b5-8fb5-91854e437360-kube-api-access-9blpj\") pod \"nova-api-3f13-account-create-update-p4zg7\" (UID: \"b2bc9278-e755-43b5-8fb5-91854e437360\") " pod="openstack/nova-api-3f13-account-create-update-p4zg7" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.598111 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2bc9278-e755-43b5-8fb5-91854e437360-operator-scripts\") pod \"nova-api-3f13-account-create-update-p4zg7\" (UID: \"b2bc9278-e755-43b5-8fb5-91854e437360\") " pod="openstack/nova-api-3f13-account-create-update-p4zg7" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.598666 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0-operator-scripts\") pod \"nova-cell1-db-create-jdhd2\" (UID: \"e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0\") " pod="openstack/nova-cell1-db-create-jdhd2" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.615160 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d01a-account-create-update-4smb4"] Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.625630 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9blpj\" (UniqueName: \"kubernetes.io/projected/b2bc9278-e755-43b5-8fb5-91854e437360-kube-api-access-9blpj\") pod \"nova-api-3f13-account-create-update-p4zg7\" (UID: \"b2bc9278-e755-43b5-8fb5-91854e437360\") " 
pod="openstack/nova-api-3f13-account-create-update-p4zg7" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.634192 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4w2k\" (UniqueName: \"kubernetes.io/projected/e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0-kube-api-access-p4w2k\") pod \"nova-cell1-db-create-jdhd2\" (UID: \"e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0\") " pod="openstack/nova-cell1-db-create-jdhd2" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.639225 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qrbf6" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.699829 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f57pd\" (UniqueName: \"kubernetes.io/projected/52e79363-534e-4d0c-9cdf-86ad75fa19bb-kube-api-access-f57pd\") pod \"nova-cell0-d01a-account-create-update-4smb4\" (UID: \"52e79363-534e-4d0c-9cdf-86ad75fa19bb\") " pod="openstack/nova-cell0-d01a-account-create-update-4smb4" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.700454 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52e79363-534e-4d0c-9cdf-86ad75fa19bb-operator-scripts\") pod \"nova-cell0-d01a-account-create-update-4smb4\" (UID: \"52e79363-534e-4d0c-9cdf-86ad75fa19bb\") " pod="openstack/nova-cell0-d01a-account-create-update-4smb4" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.707023 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-jdhd2" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.780099 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-53ed-account-create-update-8tnq8"] Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.781575 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-53ed-account-create-update-8tnq8" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.784049 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.802208 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3f13-account-create-update-p4zg7" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.803503 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f57pd\" (UniqueName: \"kubernetes.io/projected/52e79363-534e-4d0c-9cdf-86ad75fa19bb-kube-api-access-f57pd\") pod \"nova-cell0-d01a-account-create-update-4smb4\" (UID: \"52e79363-534e-4d0c-9cdf-86ad75fa19bb\") " pod="openstack/nova-cell0-d01a-account-create-update-4smb4" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.803641 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52e79363-534e-4d0c-9cdf-86ad75fa19bb-operator-scripts\") pod \"nova-cell0-d01a-account-create-update-4smb4\" (UID: \"52e79363-534e-4d0c-9cdf-86ad75fa19bb\") " pod="openstack/nova-cell0-d01a-account-create-update-4smb4" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.804408 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52e79363-534e-4d0c-9cdf-86ad75fa19bb-operator-scripts\") pod \"nova-cell0-d01a-account-create-update-4smb4\" (UID: \"52e79363-534e-4d0c-9cdf-86ad75fa19bb\") " pod="openstack/nova-cell0-d01a-account-create-update-4smb4" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.806036 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-53ed-account-create-update-8tnq8"] Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.841783 4734 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-f57pd\" (UniqueName: \"kubernetes.io/projected/52e79363-534e-4d0c-9cdf-86ad75fa19bb-kube-api-access-f57pd\") pod \"nova-cell0-d01a-account-create-update-4smb4\" (UID: \"52e79363-534e-4d0c-9cdf-86ad75fa19bb\") " pod="openstack/nova-cell0-d01a-account-create-update-4smb4" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.907641 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9rmr\" (UniqueName: \"kubernetes.io/projected/1148161b-fe59-434d-880a-80a03b0c8ff7-kube-api-access-c9rmr\") pod \"nova-cell1-53ed-account-create-update-8tnq8\" (UID: \"1148161b-fe59-434d-880a-80a03b0c8ff7\") " pod="openstack/nova-cell1-53ed-account-create-update-8tnq8" Dec 05 23:40:08 crc kubenswrapper[4734]: I1205 23:40:08.907708 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1148161b-fe59-434d-880a-80a03b0c8ff7-operator-scripts\") pod \"nova-cell1-53ed-account-create-update-8tnq8\" (UID: \"1148161b-fe59-434d-880a-80a03b0c8ff7\") " pod="openstack/nova-cell1-53ed-account-create-update-8tnq8" Dec 05 23:40:09 crc kubenswrapper[4734]: I1205 23:40:09.004797 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-d01a-account-create-update-4smb4" Dec 05 23:40:09 crc kubenswrapper[4734]: I1205 23:40:09.016245 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9rmr\" (UniqueName: \"kubernetes.io/projected/1148161b-fe59-434d-880a-80a03b0c8ff7-kube-api-access-c9rmr\") pod \"nova-cell1-53ed-account-create-update-8tnq8\" (UID: \"1148161b-fe59-434d-880a-80a03b0c8ff7\") " pod="openstack/nova-cell1-53ed-account-create-update-8tnq8" Dec 05 23:40:09 crc kubenswrapper[4734]: I1205 23:40:09.016345 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1148161b-fe59-434d-880a-80a03b0c8ff7-operator-scripts\") pod \"nova-cell1-53ed-account-create-update-8tnq8\" (UID: \"1148161b-fe59-434d-880a-80a03b0c8ff7\") " pod="openstack/nova-cell1-53ed-account-create-update-8tnq8" Dec 05 23:40:09 crc kubenswrapper[4734]: I1205 23:40:09.017611 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1148161b-fe59-434d-880a-80a03b0c8ff7-operator-scripts\") pod \"nova-cell1-53ed-account-create-update-8tnq8\" (UID: \"1148161b-fe59-434d-880a-80a03b0c8ff7\") " pod="openstack/nova-cell1-53ed-account-create-update-8tnq8" Dec 05 23:40:09 crc kubenswrapper[4734]: I1205 23:40:09.070238 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9rmr\" (UniqueName: \"kubernetes.io/projected/1148161b-fe59-434d-880a-80a03b0c8ff7-kube-api-access-c9rmr\") pod \"nova-cell1-53ed-account-create-update-8tnq8\" (UID: \"1148161b-fe59-434d-880a-80a03b0c8ff7\") " pod="openstack/nova-cell1-53ed-account-create-update-8tnq8" Dec 05 23:40:09 crc kubenswrapper[4734]: I1205 23:40:09.189473 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-4fmf5"] Dec 05 23:40:09 crc kubenswrapper[4734]: I1205 
23:40:09.278467 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-53ed-account-create-update-8tnq8" Dec 05 23:40:09 crc kubenswrapper[4734]: I1205 23:40:09.323635 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4fmf5" event={"ID":"b88ba6a0-1e12-4bd6-a483-b2522fad58f9","Type":"ContainerStarted","Data":"305ea6c9b2ebf1165bb8e1242bf6c440d42b6412f50149f842b6d1bcb77994bb"} Dec 05 23:40:09 crc kubenswrapper[4734]: W1205 23:40:09.549678 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9ac3e25_8dd3_45d6_9899_66ad5fb9f9d0.slice/crio-e852baefc08b26b38557db0b52b569f65d4229a03870bb9c8f0f53e0d01b811d WatchSource:0}: Error finding container e852baefc08b26b38557db0b52b569f65d4229a03870bb9c8f0f53e0d01b811d: Status 404 returned error can't find the container with id e852baefc08b26b38557db0b52b569f65d4229a03870bb9c8f0f53e0d01b811d Dec 05 23:40:09 crc kubenswrapper[4734]: I1205 23:40:09.551784 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-jdhd2"] Dec 05 23:40:09 crc kubenswrapper[4734]: I1205 23:40:09.671299 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-3f13-account-create-update-p4zg7"] Dec 05 23:40:09 crc kubenswrapper[4734]: I1205 23:40:09.691136 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qrbf6"] Dec 05 23:40:09 crc kubenswrapper[4734]: I1205 23:40:09.866756 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d01a-account-create-update-4smb4"] Dec 05 23:40:09 crc kubenswrapper[4734]: I1205 23:40:09.951587 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-53ed-account-create-update-8tnq8"] Dec 05 23:40:09 crc kubenswrapper[4734]: W1205 23:40:09.993525 4734 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1148161b_fe59_434d_880a_80a03b0c8ff7.slice/crio-c538f61c2307f43d436e38c276e95067365019a48fc87249cb5c631b420c3ff8 WatchSource:0}: Error finding container c538f61c2307f43d436e38c276e95067365019a48fc87249cb5c631b420c3ff8: Status 404 returned error can't find the container with id c538f61c2307f43d436e38c276e95067365019a48fc87249cb5c631b420c3ff8 Dec 05 23:40:10 crc kubenswrapper[4734]: I1205 23:40:10.337096 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d01a-account-create-update-4smb4" event={"ID":"52e79363-534e-4d0c-9cdf-86ad75fa19bb","Type":"ContainerStarted","Data":"b5fa8e702327ac8104b026a37daec74b6de61880d0a21c519f6afc9c0dc13f7b"} Dec 05 23:40:10 crc kubenswrapper[4734]: I1205 23:40:10.340742 4734 generic.go:334] "Generic (PLEG): container finished" podID="e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0" containerID="125cdc92a0a6a72d59baae612885bbd611cd97fad90627d46778a3cec2076ab4" exitCode=0 Dec 05 23:40:10 crc kubenswrapper[4734]: I1205 23:40:10.340806 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jdhd2" event={"ID":"e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0","Type":"ContainerDied","Data":"125cdc92a0a6a72d59baae612885bbd611cd97fad90627d46778a3cec2076ab4"} Dec 05 23:40:10 crc kubenswrapper[4734]: I1205 23:40:10.340865 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jdhd2" event={"ID":"e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0","Type":"ContainerStarted","Data":"e852baefc08b26b38557db0b52b569f65d4229a03870bb9c8f0f53e0d01b811d"} Dec 05 23:40:10 crc kubenswrapper[4734]: I1205 23:40:10.346033 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3f13-account-create-update-p4zg7" event={"ID":"b2bc9278-e755-43b5-8fb5-91854e437360","Type":"ContainerStarted","Data":"ff2091829a05a30ccb88eb7084e7999fd774c9be43e57361cdcf75d73a9f7e1f"} Dec 05 23:40:10 crc kubenswrapper[4734]: I1205 
23:40:10.346086 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3f13-account-create-update-p4zg7" event={"ID":"b2bc9278-e755-43b5-8fb5-91854e437360","Type":"ContainerStarted","Data":"ee1b989327170480c047c175ff77bc9a7168f72c06c352f1f8f5ac50fd9389a3"} Dec 05 23:40:10 crc kubenswrapper[4734]: I1205 23:40:10.350576 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qrbf6" event={"ID":"caa5035d-868a-4c3a-bb3e-43f7b84096f4","Type":"ContainerStarted","Data":"c3b7c1a85483199bc2a55bf39975b9a54b177b732d4c8fffea23530ddd697623"} Dec 05 23:40:10 crc kubenswrapper[4734]: I1205 23:40:10.353751 4734 generic.go:334] "Generic (PLEG): container finished" podID="b88ba6a0-1e12-4bd6-a483-b2522fad58f9" containerID="290b0639851f62b73a9b551489b2e7b6df3784c8b0c0f231090f1cf4238d4feb" exitCode=0 Dec 05 23:40:10 crc kubenswrapper[4734]: I1205 23:40:10.353811 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4fmf5" event={"ID":"b88ba6a0-1e12-4bd6-a483-b2522fad58f9","Type":"ContainerDied","Data":"290b0639851f62b73a9b551489b2e7b6df3784c8b0c0f231090f1cf4238d4feb"} Dec 05 23:40:10 crc kubenswrapper[4734]: I1205 23:40:10.361082 4734 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 23:40:10 crc kubenswrapper[4734]: I1205 23:40:10.361109 4734 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 23:40:10 crc kubenswrapper[4734]: I1205 23:40:10.362175 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-53ed-account-create-update-8tnq8" event={"ID":"1148161b-fe59-434d-880a-80a03b0c8ff7","Type":"ContainerStarted","Data":"c538f61c2307f43d436e38c276e95067365019a48fc87249cb5c631b420c3ff8"} Dec 05 23:40:10 crc kubenswrapper[4734]: I1205 23:40:10.462479 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-3f13-account-create-update-p4zg7" 
podStartSLOduration=2.4624556269999998 podStartE2EDuration="2.462455627s" podCreationTimestamp="2025-12-05 23:40:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:40:10.447668729 +0000 UTC m=+1231.131073005" watchObservedRunningTime="2025-12-05 23:40:10.462455627 +0000 UTC m=+1231.145859903" Dec 05 23:40:11 crc kubenswrapper[4734]: E1205 23:40:11.037783 4734 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2bc9278_e755_43b5_8fb5_91854e437360.slice/crio-ff2091829a05a30ccb88eb7084e7999fd774c9be43e57361cdcf75d73a9f7e1f.scope\": RecentStats: unable to find data in memory cache]" Dec 05 23:40:11 crc kubenswrapper[4734]: I1205 23:40:11.376549 4734 generic.go:334] "Generic (PLEG): container finished" podID="b2bc9278-e755-43b5-8fb5-91854e437360" containerID="ff2091829a05a30ccb88eb7084e7999fd774c9be43e57361cdcf75d73a9f7e1f" exitCode=0 Dec 05 23:40:11 crc kubenswrapper[4734]: I1205 23:40:11.376639 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3f13-account-create-update-p4zg7" event={"ID":"b2bc9278-e755-43b5-8fb5-91854e437360","Type":"ContainerDied","Data":"ff2091829a05a30ccb88eb7084e7999fd774c9be43e57361cdcf75d73a9f7e1f"} Dec 05 23:40:11 crc kubenswrapper[4734]: I1205 23:40:11.379982 4734 generic.go:334] "Generic (PLEG): container finished" podID="caa5035d-868a-4c3a-bb3e-43f7b84096f4" containerID="280dcb46d2d5e0e0ad693ab9fa11a2b80bd6d54132b3847713805478cdafa687" exitCode=0 Dec 05 23:40:11 crc kubenswrapper[4734]: I1205 23:40:11.380050 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qrbf6" event={"ID":"caa5035d-868a-4c3a-bb3e-43f7b84096f4","Type":"ContainerDied","Data":"280dcb46d2d5e0e0ad693ab9fa11a2b80bd6d54132b3847713805478cdafa687"} Dec 05 23:40:11 crc kubenswrapper[4734]: 
I1205 23:40:11.382281 4734 generic.go:334] "Generic (PLEG): container finished" podID="1148161b-fe59-434d-880a-80a03b0c8ff7" containerID="6ef81526408489648dfd43063f64a53e2dda7f0bef3c2b9ccc2203d03038925f" exitCode=0 Dec 05 23:40:11 crc kubenswrapper[4734]: I1205 23:40:11.382428 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-53ed-account-create-update-8tnq8" event={"ID":"1148161b-fe59-434d-880a-80a03b0c8ff7","Type":"ContainerDied","Data":"6ef81526408489648dfd43063f64a53e2dda7f0bef3c2b9ccc2203d03038925f"} Dec 05 23:40:11 crc kubenswrapper[4734]: I1205 23:40:11.384822 4734 generic.go:334] "Generic (PLEG): container finished" podID="52e79363-534e-4d0c-9cdf-86ad75fa19bb" containerID="285fa850414a64aee6539c2bf6a5ce85580800eb5ed4b04238c6235b47f95167" exitCode=0 Dec 05 23:40:11 crc kubenswrapper[4734]: I1205 23:40:11.384860 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d01a-account-create-update-4smb4" event={"ID":"52e79363-534e-4d0c-9cdf-86ad75fa19bb","Type":"ContainerDied","Data":"285fa850414a64aee6539c2bf6a5ce85580800eb5ed4b04238c6235b47f95167"} Dec 05 23:40:11 crc kubenswrapper[4734]: I1205 23:40:11.887887 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 23:40:11 crc kubenswrapper[4734]: I1205 23:40:11.889863 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 23:40:11 crc kubenswrapper[4734]: I1205 23:40:11.931897 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4fmf5" Dec 05 23:40:11 crc kubenswrapper[4734]: I1205 23:40:11.933588 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 23:40:11 crc kubenswrapper[4734]: I1205 23:40:11.944110 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-jdhd2" Dec 05 23:40:11 crc kubenswrapper[4734]: I1205 23:40:11.947896 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 23:40:11 crc kubenswrapper[4734]: I1205 23:40:11.981578 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 23:40:11 crc kubenswrapper[4734]: I1205 23:40:11.981719 4734 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 23:40:12 crc kubenswrapper[4734]: I1205 23:40:12.063491 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 23:40:12 crc kubenswrapper[4734]: I1205 23:40:12.132847 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lm2jq\" (UniqueName: \"kubernetes.io/projected/b88ba6a0-1e12-4bd6-a483-b2522fad58f9-kube-api-access-lm2jq\") pod \"b88ba6a0-1e12-4bd6-a483-b2522fad58f9\" (UID: \"b88ba6a0-1e12-4bd6-a483-b2522fad58f9\") " Dec 05 23:40:12 crc kubenswrapper[4734]: I1205 23:40:12.132934 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4w2k\" (UniqueName: \"kubernetes.io/projected/e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0-kube-api-access-p4w2k\") pod \"e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0\" (UID: \"e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0\") " Dec 05 23:40:12 crc kubenswrapper[4734]: I1205 23:40:12.133082 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b88ba6a0-1e12-4bd6-a483-b2522fad58f9-operator-scripts\") pod \"b88ba6a0-1e12-4bd6-a483-b2522fad58f9\" (UID: \"b88ba6a0-1e12-4bd6-a483-b2522fad58f9\") " Dec 05 23:40:12 crc kubenswrapper[4734]: I1205 23:40:12.133249 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0-operator-scripts\") pod \"e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0\" (UID: \"e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0\") " Dec 05 23:40:12 crc kubenswrapper[4734]: I1205 23:40:12.135624 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b88ba6a0-1e12-4bd6-a483-b2522fad58f9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b88ba6a0-1e12-4bd6-a483-b2522fad58f9" (UID: "b88ba6a0-1e12-4bd6-a483-b2522fad58f9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:40:12 crc kubenswrapper[4734]: I1205 23:40:12.135817 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0" (UID: "e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:40:12 crc kubenswrapper[4734]: I1205 23:40:12.143772 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0-kube-api-access-p4w2k" (OuterVolumeSpecName: "kube-api-access-p4w2k") pod "e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0" (UID: "e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0"). InnerVolumeSpecName "kube-api-access-p4w2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:40:12 crc kubenswrapper[4734]: I1205 23:40:12.160840 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b88ba6a0-1e12-4bd6-a483-b2522fad58f9-kube-api-access-lm2jq" (OuterVolumeSpecName: "kube-api-access-lm2jq") pod "b88ba6a0-1e12-4bd6-a483-b2522fad58f9" (UID: "b88ba6a0-1e12-4bd6-a483-b2522fad58f9"). InnerVolumeSpecName "kube-api-access-lm2jq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:40:12 crc kubenswrapper[4734]: I1205 23:40:12.235612 4734 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b88ba6a0-1e12-4bd6-a483-b2522fad58f9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:12 crc kubenswrapper[4734]: I1205 23:40:12.235654 4734 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:12 crc kubenswrapper[4734]: I1205 23:40:12.235667 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lm2jq\" (UniqueName: \"kubernetes.io/projected/b88ba6a0-1e12-4bd6-a483-b2522fad58f9-kube-api-access-lm2jq\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:12 crc kubenswrapper[4734]: I1205 23:40:12.235681 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4w2k\" (UniqueName: \"kubernetes.io/projected/e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0-kube-api-access-p4w2k\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:12 crc kubenswrapper[4734]: I1205 23:40:12.396657 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jdhd2" event={"ID":"e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0","Type":"ContainerDied","Data":"e852baefc08b26b38557db0b52b569f65d4229a03870bb9c8f0f53e0d01b811d"} Dec 05 23:40:12 crc kubenswrapper[4734]: I1205 23:40:12.396704 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e852baefc08b26b38557db0b52b569f65d4229a03870bb9c8f0f53e0d01b811d" Dec 05 23:40:12 crc kubenswrapper[4734]: I1205 23:40:12.396760 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-jdhd2" Dec 05 23:40:12 crc kubenswrapper[4734]: I1205 23:40:12.405726 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4fmf5" event={"ID":"b88ba6a0-1e12-4bd6-a483-b2522fad58f9","Type":"ContainerDied","Data":"305ea6c9b2ebf1165bb8e1242bf6c440d42b6412f50149f842b6d1bcb77994bb"} Dec 05 23:40:12 crc kubenswrapper[4734]: I1205 23:40:12.405779 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="305ea6c9b2ebf1165bb8e1242bf6c440d42b6412f50149f842b6d1bcb77994bb" Dec 05 23:40:12 crc kubenswrapper[4734]: I1205 23:40:12.405782 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4fmf5" Dec 05 23:40:12 crc kubenswrapper[4734]: I1205 23:40:12.406813 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 23:40:12 crc kubenswrapper[4734]: I1205 23:40:12.406850 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 23:40:12 crc kubenswrapper[4734]: I1205 23:40:12.897335 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-d01a-account-create-update-4smb4" Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.056497 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f57pd\" (UniqueName: \"kubernetes.io/projected/52e79363-534e-4d0c-9cdf-86ad75fa19bb-kube-api-access-f57pd\") pod \"52e79363-534e-4d0c-9cdf-86ad75fa19bb\" (UID: \"52e79363-534e-4d0c-9cdf-86ad75fa19bb\") " Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.056786 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52e79363-534e-4d0c-9cdf-86ad75fa19bb-operator-scripts\") pod \"52e79363-534e-4d0c-9cdf-86ad75fa19bb\" (UID: \"52e79363-534e-4d0c-9cdf-86ad75fa19bb\") " Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.058447 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52e79363-534e-4d0c-9cdf-86ad75fa19bb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "52e79363-534e-4d0c-9cdf-86ad75fa19bb" (UID: "52e79363-534e-4d0c-9cdf-86ad75fa19bb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.066496 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52e79363-534e-4d0c-9cdf-86ad75fa19bb-kube-api-access-f57pd" (OuterVolumeSpecName: "kube-api-access-f57pd") pod "52e79363-534e-4d0c-9cdf-86ad75fa19bb" (UID: "52e79363-534e-4d0c-9cdf-86ad75fa19bb"). InnerVolumeSpecName "kube-api-access-f57pd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.115192 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-qrbf6" Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.138065 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3f13-account-create-update-p4zg7" Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.147794 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-53ed-account-create-update-8tnq8" Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.160314 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f57pd\" (UniqueName: \"kubernetes.io/projected/52e79363-534e-4d0c-9cdf-86ad75fa19bb-kube-api-access-f57pd\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.160375 4734 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52e79363-534e-4d0c-9cdf-86ad75fa19bb-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.261963 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2bc9278-e755-43b5-8fb5-91854e437360-operator-scripts\") pod \"b2bc9278-e755-43b5-8fb5-91854e437360\" (UID: \"b2bc9278-e755-43b5-8fb5-91854e437360\") " Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.262039 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8v7w\" (UniqueName: \"kubernetes.io/projected/caa5035d-868a-4c3a-bb3e-43f7b84096f4-kube-api-access-q8v7w\") pod \"caa5035d-868a-4c3a-bb3e-43f7b84096f4\" (UID: \"caa5035d-868a-4c3a-bb3e-43f7b84096f4\") " Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.262114 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1148161b-fe59-434d-880a-80a03b0c8ff7-operator-scripts\") pod \"1148161b-fe59-434d-880a-80a03b0c8ff7\" (UID: \"1148161b-fe59-434d-880a-80a03b0c8ff7\") " Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.262170 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9blpj\" (UniqueName: \"kubernetes.io/projected/b2bc9278-e755-43b5-8fb5-91854e437360-kube-api-access-9blpj\") pod \"b2bc9278-e755-43b5-8fb5-91854e437360\" (UID: \"b2bc9278-e755-43b5-8fb5-91854e437360\") " Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.262279 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/caa5035d-868a-4c3a-bb3e-43f7b84096f4-operator-scripts\") pod \"caa5035d-868a-4c3a-bb3e-43f7b84096f4\" (UID: \"caa5035d-868a-4c3a-bb3e-43f7b84096f4\") " Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.262307 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9rmr\" (UniqueName: \"kubernetes.io/projected/1148161b-fe59-434d-880a-80a03b0c8ff7-kube-api-access-c9rmr\") pod \"1148161b-fe59-434d-880a-80a03b0c8ff7\" (UID: \"1148161b-fe59-434d-880a-80a03b0c8ff7\") " Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.262565 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2bc9278-e755-43b5-8fb5-91854e437360-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b2bc9278-e755-43b5-8fb5-91854e437360" (UID: "b2bc9278-e755-43b5-8fb5-91854e437360"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.262985 4734 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2bc9278-e755-43b5-8fb5-91854e437360-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.263106 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caa5035d-868a-4c3a-bb3e-43f7b84096f4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "caa5035d-868a-4c3a-bb3e-43f7b84096f4" (UID: "caa5035d-868a-4c3a-bb3e-43f7b84096f4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.263371 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1148161b-fe59-434d-880a-80a03b0c8ff7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1148161b-fe59-434d-880a-80a03b0c8ff7" (UID: "1148161b-fe59-434d-880a-80a03b0c8ff7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.266684 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caa5035d-868a-4c3a-bb3e-43f7b84096f4-kube-api-access-q8v7w" (OuterVolumeSpecName: "kube-api-access-q8v7w") pod "caa5035d-868a-4c3a-bb3e-43f7b84096f4" (UID: "caa5035d-868a-4c3a-bb3e-43f7b84096f4"). InnerVolumeSpecName "kube-api-access-q8v7w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.267324 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1148161b-fe59-434d-880a-80a03b0c8ff7-kube-api-access-c9rmr" (OuterVolumeSpecName: "kube-api-access-c9rmr") pod "1148161b-fe59-434d-880a-80a03b0c8ff7" (UID: "1148161b-fe59-434d-880a-80a03b0c8ff7"). InnerVolumeSpecName "kube-api-access-c9rmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.268695 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2bc9278-e755-43b5-8fb5-91854e437360-kube-api-access-9blpj" (OuterVolumeSpecName: "kube-api-access-9blpj") pod "b2bc9278-e755-43b5-8fb5-91854e437360" (UID: "b2bc9278-e755-43b5-8fb5-91854e437360"). InnerVolumeSpecName "kube-api-access-9blpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.365085 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8v7w\" (UniqueName: \"kubernetes.io/projected/caa5035d-868a-4c3a-bb3e-43f7b84096f4-kube-api-access-q8v7w\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.365431 4734 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1148161b-fe59-434d-880a-80a03b0c8ff7-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.365504 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9blpj\" (UniqueName: \"kubernetes.io/projected/b2bc9278-e755-43b5-8fb5-91854e437360-kube-api-access-9blpj\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.365588 4734 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/caa5035d-868a-4c3a-bb3e-43f7b84096f4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.365657 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9rmr\" (UniqueName: \"kubernetes.io/projected/1148161b-fe59-434d-880a-80a03b0c8ff7-kube-api-access-c9rmr\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.418998 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qrbf6" event={"ID":"caa5035d-868a-4c3a-bb3e-43f7b84096f4","Type":"ContainerDied","Data":"c3b7c1a85483199bc2a55bf39975b9a54b177b732d4c8fffea23530ddd697623"} Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.419087 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3b7c1a85483199bc2a55bf39975b9a54b177b732d4c8fffea23530ddd697623" Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.419014 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qrbf6" Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.420998 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-53ed-account-create-update-8tnq8" event={"ID":"1148161b-fe59-434d-880a-80a03b0c8ff7","Type":"ContainerDied","Data":"c538f61c2307f43d436e38c276e95067365019a48fc87249cb5c631b420c3ff8"} Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.421177 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c538f61c2307f43d436e38c276e95067365019a48fc87249cb5c631b420c3ff8" Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.421010 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-53ed-account-create-update-8tnq8" Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.422702 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-d01a-account-create-update-4smb4" Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.422724 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d01a-account-create-update-4smb4" event={"ID":"52e79363-534e-4d0c-9cdf-86ad75fa19bb","Type":"ContainerDied","Data":"b5fa8e702327ac8104b026a37daec74b6de61880d0a21c519f6afc9c0dc13f7b"} Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.422751 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5fa8e702327ac8104b026a37daec74b6de61880d0a21c519f6afc9c0dc13f7b" Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.424886 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3f13-account-create-update-p4zg7" Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.428637 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3f13-account-create-update-p4zg7" event={"ID":"b2bc9278-e755-43b5-8fb5-91854e437360","Type":"ContainerDied","Data":"ee1b989327170480c047c175ff77bc9a7168f72c06c352f1f8f5ac50fd9389a3"} Dec 05 23:40:13 crc kubenswrapper[4734]: I1205 23:40:13.428662 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee1b989327170480c047c175ff77bc9a7168f72c06c352f1f8f5ac50fd9389a3" Dec 05 23:40:14 crc kubenswrapper[4734]: I1205 23:40:14.957079 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 23:40:14 crc kubenswrapper[4734]: I1205 23:40:14.957256 4734 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 23:40:15 crc kubenswrapper[4734]: I1205 23:40:15.044726 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 23:40:18 crc kubenswrapper[4734]: I1205 23:40:18.835439 4734 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell0-conductor-db-sync-jcths"] Dec 05 23:40:18 crc kubenswrapper[4734]: E1205 23:40:18.853847 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1148161b-fe59-434d-880a-80a03b0c8ff7" containerName="mariadb-account-create-update" Dec 05 23:40:18 crc kubenswrapper[4734]: I1205 23:40:18.853903 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="1148161b-fe59-434d-880a-80a03b0c8ff7" containerName="mariadb-account-create-update" Dec 05 23:40:18 crc kubenswrapper[4734]: E1205 23:40:18.853939 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2bc9278-e755-43b5-8fb5-91854e437360" containerName="mariadb-account-create-update" Dec 05 23:40:18 crc kubenswrapper[4734]: I1205 23:40:18.853946 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2bc9278-e755-43b5-8fb5-91854e437360" containerName="mariadb-account-create-update" Dec 05 23:40:18 crc kubenswrapper[4734]: E1205 23:40:18.853972 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b88ba6a0-1e12-4bd6-a483-b2522fad58f9" containerName="mariadb-database-create" Dec 05 23:40:18 crc kubenswrapper[4734]: I1205 23:40:18.853980 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="b88ba6a0-1e12-4bd6-a483-b2522fad58f9" containerName="mariadb-database-create" Dec 05 23:40:18 crc kubenswrapper[4734]: E1205 23:40:18.854005 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52e79363-534e-4d0c-9cdf-86ad75fa19bb" containerName="mariadb-account-create-update" Dec 05 23:40:18 crc kubenswrapper[4734]: I1205 23:40:18.854011 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="52e79363-534e-4d0c-9cdf-86ad75fa19bb" containerName="mariadb-account-create-update" Dec 05 23:40:18 crc kubenswrapper[4734]: E1205 23:40:18.854029 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0" containerName="mariadb-database-create" Dec 05 23:40:18 crc kubenswrapper[4734]: 
I1205 23:40:18.854035 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0" containerName="mariadb-database-create" Dec 05 23:40:18 crc kubenswrapper[4734]: E1205 23:40:18.854073 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caa5035d-868a-4c3a-bb3e-43f7b84096f4" containerName="mariadb-database-create" Dec 05 23:40:18 crc kubenswrapper[4734]: I1205 23:40:18.854080 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="caa5035d-868a-4c3a-bb3e-43f7b84096f4" containerName="mariadb-database-create" Dec 05 23:40:18 crc kubenswrapper[4734]: I1205 23:40:18.854528 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="1148161b-fe59-434d-880a-80a03b0c8ff7" containerName="mariadb-account-create-update" Dec 05 23:40:18 crc kubenswrapper[4734]: I1205 23:40:18.854575 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0" containerName="mariadb-database-create" Dec 05 23:40:18 crc kubenswrapper[4734]: I1205 23:40:18.854599 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="b88ba6a0-1e12-4bd6-a483-b2522fad58f9" containerName="mariadb-database-create" Dec 05 23:40:18 crc kubenswrapper[4734]: I1205 23:40:18.854618 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2bc9278-e755-43b5-8fb5-91854e437360" containerName="mariadb-account-create-update" Dec 05 23:40:18 crc kubenswrapper[4734]: I1205 23:40:18.854629 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="caa5035d-868a-4c3a-bb3e-43f7b84096f4" containerName="mariadb-database-create" Dec 05 23:40:18 crc kubenswrapper[4734]: I1205 23:40:18.854645 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="52e79363-534e-4d0c-9cdf-86ad75fa19bb" containerName="mariadb-account-create-update" Dec 05 23:40:18 crc kubenswrapper[4734]: I1205 23:40:18.855775 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jcths" Dec 05 23:40:18 crc kubenswrapper[4734]: I1205 23:40:18.872059 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 05 23:40:18 crc kubenswrapper[4734]: I1205 23:40:18.872458 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 05 23:40:18 crc kubenswrapper[4734]: I1205 23:40:18.872469 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-5mptx" Dec 05 23:40:18 crc kubenswrapper[4734]: I1205 23:40:18.895500 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jcths"] Dec 05 23:40:19 crc kubenswrapper[4734]: I1205 23:40:19.003608 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/017da56d-32a5-42b2-91c5-efc5fc6480c3-config-data\") pod \"nova-cell0-conductor-db-sync-jcths\" (UID: \"017da56d-32a5-42b2-91c5-efc5fc6480c3\") " pod="openstack/nova-cell0-conductor-db-sync-jcths" Dec 05 23:40:19 crc kubenswrapper[4734]: I1205 23:40:19.003697 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/017da56d-32a5-42b2-91c5-efc5fc6480c3-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jcths\" (UID: \"017da56d-32a5-42b2-91c5-efc5fc6480c3\") " pod="openstack/nova-cell0-conductor-db-sync-jcths" Dec 05 23:40:19 crc kubenswrapper[4734]: I1205 23:40:19.003723 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/017da56d-32a5-42b2-91c5-efc5fc6480c3-scripts\") pod \"nova-cell0-conductor-db-sync-jcths\" (UID: \"017da56d-32a5-42b2-91c5-efc5fc6480c3\") " 
pod="openstack/nova-cell0-conductor-db-sync-jcths" Dec 05 23:40:19 crc kubenswrapper[4734]: I1205 23:40:19.003785 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsdb5\" (UniqueName: \"kubernetes.io/projected/017da56d-32a5-42b2-91c5-efc5fc6480c3-kube-api-access-rsdb5\") pod \"nova-cell0-conductor-db-sync-jcths\" (UID: \"017da56d-32a5-42b2-91c5-efc5fc6480c3\") " pod="openstack/nova-cell0-conductor-db-sync-jcths" Dec 05 23:40:19 crc kubenswrapper[4734]: I1205 23:40:19.106188 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsdb5\" (UniqueName: \"kubernetes.io/projected/017da56d-32a5-42b2-91c5-efc5fc6480c3-kube-api-access-rsdb5\") pod \"nova-cell0-conductor-db-sync-jcths\" (UID: \"017da56d-32a5-42b2-91c5-efc5fc6480c3\") " pod="openstack/nova-cell0-conductor-db-sync-jcths" Dec 05 23:40:19 crc kubenswrapper[4734]: I1205 23:40:19.106362 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/017da56d-32a5-42b2-91c5-efc5fc6480c3-config-data\") pod \"nova-cell0-conductor-db-sync-jcths\" (UID: \"017da56d-32a5-42b2-91c5-efc5fc6480c3\") " pod="openstack/nova-cell0-conductor-db-sync-jcths" Dec 05 23:40:19 crc kubenswrapper[4734]: I1205 23:40:19.106402 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/017da56d-32a5-42b2-91c5-efc5fc6480c3-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jcths\" (UID: \"017da56d-32a5-42b2-91c5-efc5fc6480c3\") " pod="openstack/nova-cell0-conductor-db-sync-jcths" Dec 05 23:40:19 crc kubenswrapper[4734]: I1205 23:40:19.106428 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/017da56d-32a5-42b2-91c5-efc5fc6480c3-scripts\") pod \"nova-cell0-conductor-db-sync-jcths\" (UID: 
\"017da56d-32a5-42b2-91c5-efc5fc6480c3\") " pod="openstack/nova-cell0-conductor-db-sync-jcths" Dec 05 23:40:19 crc kubenswrapper[4734]: I1205 23:40:19.116043 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/017da56d-32a5-42b2-91c5-efc5fc6480c3-scripts\") pod \"nova-cell0-conductor-db-sync-jcths\" (UID: \"017da56d-32a5-42b2-91c5-efc5fc6480c3\") " pod="openstack/nova-cell0-conductor-db-sync-jcths" Dec 05 23:40:19 crc kubenswrapper[4734]: I1205 23:40:19.116072 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/017da56d-32a5-42b2-91c5-efc5fc6480c3-config-data\") pod \"nova-cell0-conductor-db-sync-jcths\" (UID: \"017da56d-32a5-42b2-91c5-efc5fc6480c3\") " pod="openstack/nova-cell0-conductor-db-sync-jcths" Dec 05 23:40:19 crc kubenswrapper[4734]: I1205 23:40:19.121805 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/017da56d-32a5-42b2-91c5-efc5fc6480c3-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jcths\" (UID: \"017da56d-32a5-42b2-91c5-efc5fc6480c3\") " pod="openstack/nova-cell0-conductor-db-sync-jcths" Dec 05 23:40:19 crc kubenswrapper[4734]: I1205 23:40:19.128979 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsdb5\" (UniqueName: \"kubernetes.io/projected/017da56d-32a5-42b2-91c5-efc5fc6480c3-kube-api-access-rsdb5\") pod \"nova-cell0-conductor-db-sync-jcths\" (UID: \"017da56d-32a5-42b2-91c5-efc5fc6480c3\") " pod="openstack/nova-cell0-conductor-db-sync-jcths" Dec 05 23:40:19 crc kubenswrapper[4734]: I1205 23:40:19.206401 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jcths" Dec 05 23:40:19 crc kubenswrapper[4734]: I1205 23:40:19.715407 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jcths"] Dec 05 23:40:19 crc kubenswrapper[4734]: W1205 23:40:19.733331 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod017da56d_32a5_42b2_91c5_efc5fc6480c3.slice/crio-2631e16a90762dcb5ce5cf8ae8b17c0bf04c6f6541bda73a0a0a0f3228ee0b33 WatchSource:0}: Error finding container 2631e16a90762dcb5ce5cf8ae8b17c0bf04c6f6541bda73a0a0a0f3228ee0b33: Status 404 returned error can't find the container with id 2631e16a90762dcb5ce5cf8ae8b17c0bf04c6f6541bda73a0a0a0f3228ee0b33 Dec 05 23:40:20 crc kubenswrapper[4734]: I1205 23:40:20.502760 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jcths" event={"ID":"017da56d-32a5-42b2-91c5-efc5fc6480c3","Type":"ContainerStarted","Data":"2631e16a90762dcb5ce5cf8ae8b17c0bf04c6f6541bda73a0a0a0f3228ee0b33"} Dec 05 23:40:23 crc kubenswrapper[4734]: I1205 23:40:23.652842 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.533082 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.608108 4734 generic.go:334] "Generic (PLEG): container finished" podID="f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350" containerID="eee0ce98affad252808be83570075bdf43423788a61427e6d7b3f43a9e5f797f" exitCode=137 Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.608209 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350","Type":"ContainerDied","Data":"eee0ce98affad252808be83570075bdf43423788a61427e6d7b3f43a9e5f797f"} Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.608271 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350","Type":"ContainerDied","Data":"169d9790764ca94a7080c24a467cfd5e18b0a92f7159da00c2b7158dbcc6390b"} Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.608295 4734 scope.go:117] "RemoveContainer" containerID="eee0ce98affad252808be83570075bdf43423788a61427e6d7b3f43a9e5f797f" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.608483 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.621809 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jcths" event={"ID":"017da56d-32a5-42b2-91c5-efc5fc6480c3","Type":"ContainerStarted","Data":"be080adf0c7e472e9d3f2c20e97eab322e3127ee9529e220c989ef5ee124359d"} Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.644839 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-jcths" podStartSLOduration=3.050200531 podStartE2EDuration="10.644816199s" podCreationTimestamp="2025-12-05 23:40:18 +0000 UTC" firstStartedPulling="2025-12-05 23:40:19.735676713 +0000 UTC m=+1240.419080989" lastFinishedPulling="2025-12-05 23:40:27.330292381 +0000 UTC m=+1248.013696657" observedRunningTime="2025-12-05 23:40:28.643327833 +0000 UTC m=+1249.326732109" watchObservedRunningTime="2025-12-05 23:40:28.644816199 +0000 UTC m=+1249.328220475" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.646125 4734 scope.go:117] "RemoveContainer" containerID="d28f8c59e2b624118f759e364781e1afa8a6cdbbf4a691768c964f62c6637a01" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.646161 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqsnf\" (UniqueName: \"kubernetes.io/projected/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-kube-api-access-qqsnf\") pod \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\" (UID: \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\") " Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.646201 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-run-httpd\") pod \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\" (UID: \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\") " Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.646326 4734 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-combined-ca-bundle\") pod \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\" (UID: \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\") " Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.647441 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-ceilometer-tls-certs\") pod \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\" (UID: \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\") " Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.647211 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350" (UID: "f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.647568 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-config-data\") pod \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\" (UID: \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\") " Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.647649 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-sg-core-conf-yaml\") pod \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\" (UID: \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\") " Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.647684 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-log-httpd\") pod \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\" (UID: \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\") " Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.647779 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-scripts\") pod \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\" (UID: \"f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350\") " Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.648324 4734 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.648708 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350" (UID: "f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.676090 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-kube-api-access-qqsnf" (OuterVolumeSpecName: "kube-api-access-qqsnf") pod "f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350" (UID: "f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350"). InnerVolumeSpecName "kube-api-access-qqsnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.676336 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-scripts" (OuterVolumeSpecName: "scripts") pod "f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350" (UID: "f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.686474 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350" (UID: "f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.727130 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350" (UID: "f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.750767 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqsnf\" (UniqueName: \"kubernetes.io/projected/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-kube-api-access-qqsnf\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.750807 4734 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.750820 4734 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.750835 4734 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.750849 4734 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.760845 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350" (UID: "f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.791650 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-config-data" (OuterVolumeSpecName: "config-data") pod "f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350" (UID: "f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.853320 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.853366 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.853646 4734 scope.go:117] "RemoveContainer" containerID="908819d566bd6792c4502a5e6fba71c14d9e6bfa7c5c91dba03ca94e9316d98f" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.878111 4734 scope.go:117] "RemoveContainer" containerID="30302b4abf366dfba310850d0125358565c63a7dfc4569e6a155078856267070" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.902512 4734 scope.go:117] "RemoveContainer" containerID="eee0ce98affad252808be83570075bdf43423788a61427e6d7b3f43a9e5f797f" Dec 05 23:40:28 crc kubenswrapper[4734]: E1205 23:40:28.903339 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eee0ce98affad252808be83570075bdf43423788a61427e6d7b3f43a9e5f797f\": container with ID starting with eee0ce98affad252808be83570075bdf43423788a61427e6d7b3f43a9e5f797f not found: ID does not exist" containerID="eee0ce98affad252808be83570075bdf43423788a61427e6d7b3f43a9e5f797f" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.903407 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eee0ce98affad252808be83570075bdf43423788a61427e6d7b3f43a9e5f797f"} err="failed to get container status \"eee0ce98affad252808be83570075bdf43423788a61427e6d7b3f43a9e5f797f\": rpc error: code = NotFound desc = could not find container 
\"eee0ce98affad252808be83570075bdf43423788a61427e6d7b3f43a9e5f797f\": container with ID starting with eee0ce98affad252808be83570075bdf43423788a61427e6d7b3f43a9e5f797f not found: ID does not exist" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.903447 4734 scope.go:117] "RemoveContainer" containerID="d28f8c59e2b624118f759e364781e1afa8a6cdbbf4a691768c964f62c6637a01" Dec 05 23:40:28 crc kubenswrapper[4734]: E1205 23:40:28.904305 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d28f8c59e2b624118f759e364781e1afa8a6cdbbf4a691768c964f62c6637a01\": container with ID starting with d28f8c59e2b624118f759e364781e1afa8a6cdbbf4a691768c964f62c6637a01 not found: ID does not exist" containerID="d28f8c59e2b624118f759e364781e1afa8a6cdbbf4a691768c964f62c6637a01" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.904378 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d28f8c59e2b624118f759e364781e1afa8a6cdbbf4a691768c964f62c6637a01"} err="failed to get container status \"d28f8c59e2b624118f759e364781e1afa8a6cdbbf4a691768c964f62c6637a01\": rpc error: code = NotFound desc = could not find container \"d28f8c59e2b624118f759e364781e1afa8a6cdbbf4a691768c964f62c6637a01\": container with ID starting with d28f8c59e2b624118f759e364781e1afa8a6cdbbf4a691768c964f62c6637a01 not found: ID does not exist" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.904416 4734 scope.go:117] "RemoveContainer" containerID="908819d566bd6792c4502a5e6fba71c14d9e6bfa7c5c91dba03ca94e9316d98f" Dec 05 23:40:28 crc kubenswrapper[4734]: E1205 23:40:28.904897 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"908819d566bd6792c4502a5e6fba71c14d9e6bfa7c5c91dba03ca94e9316d98f\": container with ID starting with 908819d566bd6792c4502a5e6fba71c14d9e6bfa7c5c91dba03ca94e9316d98f not found: ID does not exist" 
containerID="908819d566bd6792c4502a5e6fba71c14d9e6bfa7c5c91dba03ca94e9316d98f" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.904930 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"908819d566bd6792c4502a5e6fba71c14d9e6bfa7c5c91dba03ca94e9316d98f"} err="failed to get container status \"908819d566bd6792c4502a5e6fba71c14d9e6bfa7c5c91dba03ca94e9316d98f\": rpc error: code = NotFound desc = could not find container \"908819d566bd6792c4502a5e6fba71c14d9e6bfa7c5c91dba03ca94e9316d98f\": container with ID starting with 908819d566bd6792c4502a5e6fba71c14d9e6bfa7c5c91dba03ca94e9316d98f not found: ID does not exist" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.904948 4734 scope.go:117] "RemoveContainer" containerID="30302b4abf366dfba310850d0125358565c63a7dfc4569e6a155078856267070" Dec 05 23:40:28 crc kubenswrapper[4734]: E1205 23:40:28.905664 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30302b4abf366dfba310850d0125358565c63a7dfc4569e6a155078856267070\": container with ID starting with 30302b4abf366dfba310850d0125358565c63a7dfc4569e6a155078856267070 not found: ID does not exist" containerID="30302b4abf366dfba310850d0125358565c63a7dfc4569e6a155078856267070" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.905708 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30302b4abf366dfba310850d0125358565c63a7dfc4569e6a155078856267070"} err="failed to get container status \"30302b4abf366dfba310850d0125358565c63a7dfc4569e6a155078856267070\": rpc error: code = NotFound desc = could not find container \"30302b4abf366dfba310850d0125358565c63a7dfc4569e6a155078856267070\": container with ID starting with 30302b4abf366dfba310850d0125358565c63a7dfc4569e6a155078856267070 not found: ID does not exist" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.950340 4734 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.961420 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.975594 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 23:40:28 crc kubenswrapper[4734]: E1205 23:40:28.976122 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350" containerName="sg-core" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.976145 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350" containerName="sg-core" Dec 05 23:40:28 crc kubenswrapper[4734]: E1205 23:40:28.976179 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350" containerName="ceilometer-notification-agent" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.976189 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350" containerName="ceilometer-notification-agent" Dec 05 23:40:28 crc kubenswrapper[4734]: E1205 23:40:28.976215 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350" containerName="proxy-httpd" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.976224 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350" containerName="proxy-httpd" Dec 05 23:40:28 crc kubenswrapper[4734]: E1205 23:40:28.976247 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350" containerName="ceilometer-central-agent" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.976258 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350" containerName="ceilometer-central-agent" Dec 05 23:40:28 crc 
kubenswrapper[4734]: I1205 23:40:28.976453 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350" containerName="proxy-httpd" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.976477 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350" containerName="ceilometer-central-agent" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.976493 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350" containerName="ceilometer-notification-agent" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.976504 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350" containerName="sg-core" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.978255 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.982483 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.983023 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.984061 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 23:40:28 crc kubenswrapper[4734]: I1205 23:40:28.998042 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 23:40:29 crc kubenswrapper[4734]: I1205 23:40:29.056628 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-scripts\") pod \"ceilometer-0\" (UID: \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\") " pod="openstack/ceilometer-0" Dec 05 23:40:29 
crc kubenswrapper[4734]: I1205 23:40:29.056684 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9w8x\" (UniqueName: \"kubernetes.io/projected/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-kube-api-access-s9w8x\") pod \"ceilometer-0\" (UID: \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\") " pod="openstack/ceilometer-0" Dec 05 23:40:29 crc kubenswrapper[4734]: I1205 23:40:29.056728 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\") " pod="openstack/ceilometer-0" Dec 05 23:40:29 crc kubenswrapper[4734]: I1205 23:40:29.056763 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-run-httpd\") pod \"ceilometer-0\" (UID: \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\") " pod="openstack/ceilometer-0" Dec 05 23:40:29 crc kubenswrapper[4734]: I1205 23:40:29.057372 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\") " pod="openstack/ceilometer-0" Dec 05 23:40:29 crc kubenswrapper[4734]: I1205 23:40:29.057456 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\") " pod="openstack/ceilometer-0" Dec 05 23:40:29 crc kubenswrapper[4734]: I1205 23:40:29.057484 4734 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-log-httpd\") pod \"ceilometer-0\" (UID: \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\") " pod="openstack/ceilometer-0" Dec 05 23:40:29 crc kubenswrapper[4734]: I1205 23:40:29.057521 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-config-data\") pod \"ceilometer-0\" (UID: \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\") " pod="openstack/ceilometer-0" Dec 05 23:40:29 crc kubenswrapper[4734]: I1205 23:40:29.160030 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\") " pod="openstack/ceilometer-0" Dec 05 23:40:29 crc kubenswrapper[4734]: I1205 23:40:29.160121 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-log-httpd\") pod \"ceilometer-0\" (UID: \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\") " pod="openstack/ceilometer-0" Dec 05 23:40:29 crc kubenswrapper[4734]: I1205 23:40:29.160148 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\") " pod="openstack/ceilometer-0" Dec 05 23:40:29 crc kubenswrapper[4734]: I1205 23:40:29.160173 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-config-data\") pod \"ceilometer-0\" (UID: 
\"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\") " pod="openstack/ceilometer-0" Dec 05 23:40:29 crc kubenswrapper[4734]: I1205 23:40:29.160216 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-scripts\") pod \"ceilometer-0\" (UID: \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\") " pod="openstack/ceilometer-0" Dec 05 23:40:29 crc kubenswrapper[4734]: I1205 23:40:29.160246 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9w8x\" (UniqueName: \"kubernetes.io/projected/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-kube-api-access-s9w8x\") pod \"ceilometer-0\" (UID: \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\") " pod="openstack/ceilometer-0" Dec 05 23:40:29 crc kubenswrapper[4734]: I1205 23:40:29.160868 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\") " pod="openstack/ceilometer-0" Dec 05 23:40:29 crc kubenswrapper[4734]: I1205 23:40:29.160915 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-run-httpd\") pod \"ceilometer-0\" (UID: \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\") " pod="openstack/ceilometer-0" Dec 05 23:40:29 crc kubenswrapper[4734]: I1205 23:40:29.161172 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-log-httpd\") pod \"ceilometer-0\" (UID: \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\") " pod="openstack/ceilometer-0" Dec 05 23:40:29 crc kubenswrapper[4734]: I1205 23:40:29.161326 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-run-httpd\") pod \"ceilometer-0\" (UID: \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\") " pod="openstack/ceilometer-0" Dec 05 23:40:29 crc kubenswrapper[4734]: I1205 23:40:29.165638 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\") " pod="openstack/ceilometer-0" Dec 05 23:40:29 crc kubenswrapper[4734]: I1205 23:40:29.165805 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\") " pod="openstack/ceilometer-0" Dec 05 23:40:29 crc kubenswrapper[4734]: I1205 23:40:29.167578 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-config-data\") pod \"ceilometer-0\" (UID: \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\") " pod="openstack/ceilometer-0" Dec 05 23:40:29 crc kubenswrapper[4734]: I1205 23:40:29.168725 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\") " pod="openstack/ceilometer-0" Dec 05 23:40:29 crc kubenswrapper[4734]: I1205 23:40:29.171265 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-scripts\") pod \"ceilometer-0\" (UID: \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\") " pod="openstack/ceilometer-0" Dec 05 23:40:29 crc kubenswrapper[4734]: I1205 23:40:29.181370 4734 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9w8x\" (UniqueName: \"kubernetes.io/projected/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-kube-api-access-s9w8x\") pod \"ceilometer-0\" (UID: \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\") " pod="openstack/ceilometer-0" Dec 05 23:40:29 crc kubenswrapper[4734]: I1205 23:40:29.356448 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 23:40:29 crc kubenswrapper[4734]: I1205 23:40:29.639708 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350" path="/var/lib/kubelet/pods/f3ad5c53-eefe-4e3d-8ef9-0dabf1df4350/volumes" Dec 05 23:40:29 crc kubenswrapper[4734]: I1205 23:40:29.869079 4734 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 23:40:29 crc kubenswrapper[4734]: I1205 23:40:29.881321 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 23:40:30 crc kubenswrapper[4734]: I1205 23:40:30.660678 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e","Type":"ContainerStarted","Data":"c8c1e98a40bc0f96812dadcd4575b203ee7c79b647737eccd99e82658efaefa0"} Dec 05 23:40:30 crc kubenswrapper[4734]: I1205 23:40:30.661144 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e","Type":"ContainerStarted","Data":"3b194672658ec89132d26912cc69aabdbbace67b13ff27532e8d98de6ec4eade"} Dec 05 23:40:31 crc kubenswrapper[4734]: I1205 23:40:31.678710 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e","Type":"ContainerStarted","Data":"d4d1983b58b5522a54ba87ff322862b39b4506725b8a45764be4aa4dabad2e59"} Dec 05 23:40:32 crc kubenswrapper[4734]: I1205 23:40:32.693878 4734 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e","Type":"ContainerStarted","Data":"85416fe942405d9f868ee60ff03406a7c963a8eb9655963149f5a25baf3ac145"} Dec 05 23:40:33 crc kubenswrapper[4734]: I1205 23:40:33.709504 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e","Type":"ContainerStarted","Data":"84ec79b03026fd94534cface387c534591afdd4ae3aec1de79fdaa597d333e19"} Dec 05 23:40:33 crc kubenswrapper[4734]: I1205 23:40:33.710395 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 23:40:33 crc kubenswrapper[4734]: I1205 23:40:33.751418 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.620399681 podStartE2EDuration="5.75138648s" podCreationTimestamp="2025-12-05 23:40:28 +0000 UTC" firstStartedPulling="2025-12-05 23:40:29.868780191 +0000 UTC m=+1250.552184467" lastFinishedPulling="2025-12-05 23:40:32.99976699 +0000 UTC m=+1253.683171266" observedRunningTime="2025-12-05 23:40:33.74479368 +0000 UTC m=+1254.428197946" watchObservedRunningTime="2025-12-05 23:40:33.75138648 +0000 UTC m=+1254.434790756" Dec 05 23:40:45 crc kubenswrapper[4734]: I1205 23:40:45.855151 4734 generic.go:334] "Generic (PLEG): container finished" podID="017da56d-32a5-42b2-91c5-efc5fc6480c3" containerID="be080adf0c7e472e9d3f2c20e97eab322e3127ee9529e220c989ef5ee124359d" exitCode=0 Dec 05 23:40:45 crc kubenswrapper[4734]: I1205 23:40:45.855275 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jcths" event={"ID":"017da56d-32a5-42b2-91c5-efc5fc6480c3","Type":"ContainerDied","Data":"be080adf0c7e472e9d3f2c20e97eab322e3127ee9529e220c989ef5ee124359d"} Dec 05 23:40:47 crc kubenswrapper[4734]: I1205 23:40:47.242268 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jcths" Dec 05 23:40:47 crc kubenswrapper[4734]: I1205 23:40:47.324033 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsdb5\" (UniqueName: \"kubernetes.io/projected/017da56d-32a5-42b2-91c5-efc5fc6480c3-kube-api-access-rsdb5\") pod \"017da56d-32a5-42b2-91c5-efc5fc6480c3\" (UID: \"017da56d-32a5-42b2-91c5-efc5fc6480c3\") " Dec 05 23:40:47 crc kubenswrapper[4734]: I1205 23:40:47.324381 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/017da56d-32a5-42b2-91c5-efc5fc6480c3-combined-ca-bundle\") pod \"017da56d-32a5-42b2-91c5-efc5fc6480c3\" (UID: \"017da56d-32a5-42b2-91c5-efc5fc6480c3\") " Dec 05 23:40:47 crc kubenswrapper[4734]: I1205 23:40:47.324459 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/017da56d-32a5-42b2-91c5-efc5fc6480c3-scripts\") pod \"017da56d-32a5-42b2-91c5-efc5fc6480c3\" (UID: \"017da56d-32a5-42b2-91c5-efc5fc6480c3\") " Dec 05 23:40:47 crc kubenswrapper[4734]: I1205 23:40:47.325080 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/017da56d-32a5-42b2-91c5-efc5fc6480c3-config-data\") pod \"017da56d-32a5-42b2-91c5-efc5fc6480c3\" (UID: \"017da56d-32a5-42b2-91c5-efc5fc6480c3\") " Dec 05 23:40:47 crc kubenswrapper[4734]: I1205 23:40:47.335044 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/017da56d-32a5-42b2-91c5-efc5fc6480c3-kube-api-access-rsdb5" (OuterVolumeSpecName: "kube-api-access-rsdb5") pod "017da56d-32a5-42b2-91c5-efc5fc6480c3" (UID: "017da56d-32a5-42b2-91c5-efc5fc6480c3"). InnerVolumeSpecName "kube-api-access-rsdb5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:40:47 crc kubenswrapper[4734]: I1205 23:40:47.338680 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/017da56d-32a5-42b2-91c5-efc5fc6480c3-scripts" (OuterVolumeSpecName: "scripts") pod "017da56d-32a5-42b2-91c5-efc5fc6480c3" (UID: "017da56d-32a5-42b2-91c5-efc5fc6480c3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:40:47 crc kubenswrapper[4734]: I1205 23:40:47.356769 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/017da56d-32a5-42b2-91c5-efc5fc6480c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "017da56d-32a5-42b2-91c5-efc5fc6480c3" (UID: "017da56d-32a5-42b2-91c5-efc5fc6480c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:40:47 crc kubenswrapper[4734]: I1205 23:40:47.359767 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/017da56d-32a5-42b2-91c5-efc5fc6480c3-config-data" (OuterVolumeSpecName: "config-data") pod "017da56d-32a5-42b2-91c5-efc5fc6480c3" (UID: "017da56d-32a5-42b2-91c5-efc5fc6480c3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:40:47 crc kubenswrapper[4734]: I1205 23:40:47.428580 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsdb5\" (UniqueName: \"kubernetes.io/projected/017da56d-32a5-42b2-91c5-efc5fc6480c3-kube-api-access-rsdb5\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:47 crc kubenswrapper[4734]: I1205 23:40:47.428621 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/017da56d-32a5-42b2-91c5-efc5fc6480c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:47 crc kubenswrapper[4734]: I1205 23:40:47.428631 4734 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/017da56d-32a5-42b2-91c5-efc5fc6480c3-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:47 crc kubenswrapper[4734]: I1205 23:40:47.428641 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/017da56d-32a5-42b2-91c5-efc5fc6480c3-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:40:47 crc kubenswrapper[4734]: I1205 23:40:47.883493 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jcths" event={"ID":"017da56d-32a5-42b2-91c5-efc5fc6480c3","Type":"ContainerDied","Data":"2631e16a90762dcb5ce5cf8ae8b17c0bf04c6f6541bda73a0a0a0f3228ee0b33"} Dec 05 23:40:47 crc kubenswrapper[4734]: I1205 23:40:47.883589 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2631e16a90762dcb5ce5cf8ae8b17c0bf04c6f6541bda73a0a0a0f3228ee0b33" Dec 05 23:40:47 crc kubenswrapper[4734]: I1205 23:40:47.883601 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jcths" Dec 05 23:40:47 crc kubenswrapper[4734]: I1205 23:40:47.999731 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 23:40:48 crc kubenswrapper[4734]: E1205 23:40:48.000361 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="017da56d-32a5-42b2-91c5-efc5fc6480c3" containerName="nova-cell0-conductor-db-sync" Dec 05 23:40:48 crc kubenswrapper[4734]: I1205 23:40:48.000388 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="017da56d-32a5-42b2-91c5-efc5fc6480c3" containerName="nova-cell0-conductor-db-sync" Dec 05 23:40:48 crc kubenswrapper[4734]: I1205 23:40:48.000675 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="017da56d-32a5-42b2-91c5-efc5fc6480c3" containerName="nova-cell0-conductor-db-sync" Dec 05 23:40:48 crc kubenswrapper[4734]: I1205 23:40:48.001677 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 23:40:48 crc kubenswrapper[4734]: I1205 23:40:48.004938 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-5mptx" Dec 05 23:40:48 crc kubenswrapper[4734]: I1205 23:40:48.008221 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 05 23:40:48 crc kubenswrapper[4734]: I1205 23:40:48.016827 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 23:40:48 crc kubenswrapper[4734]: I1205 23:40:48.144234 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97634e74-2d01-49ae-b584-650725749027-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"97634e74-2d01-49ae-b584-650725749027\") " pod="openstack/nova-cell0-conductor-0" Dec 05 23:40:48 crc kubenswrapper[4734]: 
I1205 23:40:48.144459 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bfjp\" (UniqueName: \"kubernetes.io/projected/97634e74-2d01-49ae-b584-650725749027-kube-api-access-7bfjp\") pod \"nova-cell0-conductor-0\" (UID: \"97634e74-2d01-49ae-b584-650725749027\") " pod="openstack/nova-cell0-conductor-0" Dec 05 23:40:48 crc kubenswrapper[4734]: I1205 23:40:48.144543 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97634e74-2d01-49ae-b584-650725749027-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"97634e74-2d01-49ae-b584-650725749027\") " pod="openstack/nova-cell0-conductor-0" Dec 05 23:40:48 crc kubenswrapper[4734]: I1205 23:40:48.247301 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bfjp\" (UniqueName: \"kubernetes.io/projected/97634e74-2d01-49ae-b584-650725749027-kube-api-access-7bfjp\") pod \"nova-cell0-conductor-0\" (UID: \"97634e74-2d01-49ae-b584-650725749027\") " pod="openstack/nova-cell0-conductor-0" Dec 05 23:40:48 crc kubenswrapper[4734]: I1205 23:40:48.247445 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97634e74-2d01-49ae-b584-650725749027-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"97634e74-2d01-49ae-b584-650725749027\") " pod="openstack/nova-cell0-conductor-0" Dec 05 23:40:48 crc kubenswrapper[4734]: I1205 23:40:48.247606 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97634e74-2d01-49ae-b584-650725749027-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"97634e74-2d01-49ae-b584-650725749027\") " pod="openstack/nova-cell0-conductor-0" Dec 05 23:40:48 crc kubenswrapper[4734]: I1205 23:40:48.253953 4734 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97634e74-2d01-49ae-b584-650725749027-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"97634e74-2d01-49ae-b584-650725749027\") " pod="openstack/nova-cell0-conductor-0" Dec 05 23:40:48 crc kubenswrapper[4734]: I1205 23:40:48.254073 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97634e74-2d01-49ae-b584-650725749027-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"97634e74-2d01-49ae-b584-650725749027\") " pod="openstack/nova-cell0-conductor-0" Dec 05 23:40:48 crc kubenswrapper[4734]: I1205 23:40:48.268150 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bfjp\" (UniqueName: \"kubernetes.io/projected/97634e74-2d01-49ae-b584-650725749027-kube-api-access-7bfjp\") pod \"nova-cell0-conductor-0\" (UID: \"97634e74-2d01-49ae-b584-650725749027\") " pod="openstack/nova-cell0-conductor-0" Dec 05 23:40:48 crc kubenswrapper[4734]: I1205 23:40:48.322553 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 23:40:48 crc kubenswrapper[4734]: I1205 23:40:48.812729 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 23:40:48 crc kubenswrapper[4734]: I1205 23:40:48.903971 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"97634e74-2d01-49ae-b584-650725749027","Type":"ContainerStarted","Data":"487f699b8a1ef8cf000f659de94196c4a82e3b41f2edcc2a8fdccd5a1f8586a4"} Dec 05 23:40:49 crc kubenswrapper[4734]: I1205 23:40:49.922997 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"97634e74-2d01-49ae-b584-650725749027","Type":"ContainerStarted","Data":"2c81df4aefce071c6c20d36c49dc4add2d06b105e5441c5a33c1d49a686c30f1"} Dec 05 23:40:49 crc kubenswrapper[4734]: I1205 23:40:49.925459 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 05 23:40:49 crc kubenswrapper[4734]: I1205 23:40:49.951623 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.951594178 podStartE2EDuration="2.951594178s" podCreationTimestamp="2025-12-05 23:40:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:40:49.940376086 +0000 UTC m=+1270.623780382" watchObservedRunningTime="2025-12-05 23:40:49.951594178 +0000 UTC m=+1270.634998454" Dec 05 23:40:58 crc kubenswrapper[4734]: I1205 23:40:58.355644 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 05 23:40:58 crc kubenswrapper[4734]: I1205 23:40:58.827876 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-mbpqk"] Dec 05 23:40:58 crc kubenswrapper[4734]: I1205 23:40:58.830134 4734 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mbpqk" Dec 05 23:40:58 crc kubenswrapper[4734]: I1205 23:40:58.832285 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 05 23:40:58 crc kubenswrapper[4734]: I1205 23:40:58.833050 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 05 23:40:58 crc kubenswrapper[4734]: I1205 23:40:58.844945 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mbpqk"] Dec 05 23:40:58 crc kubenswrapper[4734]: I1205 23:40:58.899763 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a44e1e9c-243f-4967-ac93-72db0dd02eb0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mbpqk\" (UID: \"a44e1e9c-243f-4967-ac93-72db0dd02eb0\") " pod="openstack/nova-cell0-cell-mapping-mbpqk" Dec 05 23:40:58 crc kubenswrapper[4734]: I1205 23:40:58.899969 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a44e1e9c-243f-4967-ac93-72db0dd02eb0-scripts\") pod \"nova-cell0-cell-mapping-mbpqk\" (UID: \"a44e1e9c-243f-4967-ac93-72db0dd02eb0\") " pod="openstack/nova-cell0-cell-mapping-mbpqk" Dec 05 23:40:58 crc kubenswrapper[4734]: I1205 23:40:58.900008 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a44e1e9c-243f-4967-ac93-72db0dd02eb0-config-data\") pod \"nova-cell0-cell-mapping-mbpqk\" (UID: \"a44e1e9c-243f-4967-ac93-72db0dd02eb0\") " pod="openstack/nova-cell0-cell-mapping-mbpqk" Dec 05 23:40:58 crc kubenswrapper[4734]: I1205 23:40:58.900126 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7pccp\" (UniqueName: \"kubernetes.io/projected/a44e1e9c-243f-4967-ac93-72db0dd02eb0-kube-api-access-7pccp\") pod \"nova-cell0-cell-mapping-mbpqk\" (UID: \"a44e1e9c-243f-4967-ac93-72db0dd02eb0\") " pod="openstack/nova-cell0-cell-mapping-mbpqk" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.002585 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a44e1e9c-243f-4967-ac93-72db0dd02eb0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mbpqk\" (UID: \"a44e1e9c-243f-4967-ac93-72db0dd02eb0\") " pod="openstack/nova-cell0-cell-mapping-mbpqk" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.003036 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a44e1e9c-243f-4967-ac93-72db0dd02eb0-scripts\") pod \"nova-cell0-cell-mapping-mbpqk\" (UID: \"a44e1e9c-243f-4967-ac93-72db0dd02eb0\") " pod="openstack/nova-cell0-cell-mapping-mbpqk" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.003186 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a44e1e9c-243f-4967-ac93-72db0dd02eb0-config-data\") pod \"nova-cell0-cell-mapping-mbpqk\" (UID: \"a44e1e9c-243f-4967-ac93-72db0dd02eb0\") " pod="openstack/nova-cell0-cell-mapping-mbpqk" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.003332 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pccp\" (UniqueName: \"kubernetes.io/projected/a44e1e9c-243f-4967-ac93-72db0dd02eb0-kube-api-access-7pccp\") pod \"nova-cell0-cell-mapping-mbpqk\" (UID: \"a44e1e9c-243f-4967-ac93-72db0dd02eb0\") " pod="openstack/nova-cell0-cell-mapping-mbpqk" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.015650 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a44e1e9c-243f-4967-ac93-72db0dd02eb0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mbpqk\" (UID: \"a44e1e9c-243f-4967-ac93-72db0dd02eb0\") " pod="openstack/nova-cell0-cell-mapping-mbpqk" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.018601 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a44e1e9c-243f-4967-ac93-72db0dd02eb0-config-data\") pod \"nova-cell0-cell-mapping-mbpqk\" (UID: \"a44e1e9c-243f-4967-ac93-72db0dd02eb0\") " pod="openstack/nova-cell0-cell-mapping-mbpqk" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.029330 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a44e1e9c-243f-4967-ac93-72db0dd02eb0-scripts\") pod \"nova-cell0-cell-mapping-mbpqk\" (UID: \"a44e1e9c-243f-4967-ac93-72db0dd02eb0\") " pod="openstack/nova-cell0-cell-mapping-mbpqk" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.064295 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pccp\" (UniqueName: \"kubernetes.io/projected/a44e1e9c-243f-4967-ac93-72db0dd02eb0-kube-api-access-7pccp\") pod \"nova-cell0-cell-mapping-mbpqk\" (UID: \"a44e1e9c-243f-4967-ac93-72db0dd02eb0\") " pod="openstack/nova-cell0-cell-mapping-mbpqk" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.140501 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.142856 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.166224 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mbpqk" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.173200 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.193263 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.217914 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f5a50fc-e01d-43e6-9c99-9e1693246981-logs\") pod \"nova-api-0\" (UID: \"4f5a50fc-e01d-43e6-9c99-9e1693246981\") " pod="openstack/nova-api-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.217974 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f5a50fc-e01d-43e6-9c99-9e1693246981-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4f5a50fc-e01d-43e6-9c99-9e1693246981\") " pod="openstack/nova-api-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.218005 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvglm\" (UniqueName: \"kubernetes.io/projected/4f5a50fc-e01d-43e6-9c99-9e1693246981-kube-api-access-vvglm\") pod \"nova-api-0\" (UID: \"4f5a50fc-e01d-43e6-9c99-9e1693246981\") " pod="openstack/nova-api-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.218019 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f5a50fc-e01d-43e6-9c99-9e1693246981-config-data\") pod \"nova-api-0\" (UID: \"4f5a50fc-e01d-43e6-9c99-9e1693246981\") " pod="openstack/nova-api-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.321236 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f5a50fc-e01d-43e6-9c99-9e1693246981-logs\") pod \"nova-api-0\" (UID: \"4f5a50fc-e01d-43e6-9c99-9e1693246981\") " pod="openstack/nova-api-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.321291 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f5a50fc-e01d-43e6-9c99-9e1693246981-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4f5a50fc-e01d-43e6-9c99-9e1693246981\") " pod="openstack/nova-api-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.321323 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvglm\" (UniqueName: \"kubernetes.io/projected/4f5a50fc-e01d-43e6-9c99-9e1693246981-kube-api-access-vvglm\") pod \"nova-api-0\" (UID: \"4f5a50fc-e01d-43e6-9c99-9e1693246981\") " pod="openstack/nova-api-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.321349 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f5a50fc-e01d-43e6-9c99-9e1693246981-config-data\") pod \"nova-api-0\" (UID: \"4f5a50fc-e01d-43e6-9c99-9e1693246981\") " pod="openstack/nova-api-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.322606 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f5a50fc-e01d-43e6-9c99-9e1693246981-logs\") pod \"nova-api-0\" (UID: \"4f5a50fc-e01d-43e6-9c99-9e1693246981\") " pod="openstack/nova-api-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.333633 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f5a50fc-e01d-43e6-9c99-9e1693246981-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4f5a50fc-e01d-43e6-9c99-9e1693246981\") " pod="openstack/nova-api-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 
23:40:59.385395 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.402666 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.397269 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f5a50fc-e01d-43e6-9c99-9e1693246981-config-data\") pod \"nova-api-0\" (UID: \"4f5a50fc-e01d-43e6-9c99-9e1693246981\") " pod="openstack/nova-api-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.389460 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvglm\" (UniqueName: \"kubernetes.io/projected/4f5a50fc-e01d-43e6-9c99-9e1693246981-kube-api-access-vvglm\") pod \"nova-api-0\" (UID: \"4f5a50fc-e01d-43e6-9c99-9e1693246981\") " pod="openstack/nova-api-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.418573 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.453884 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.505980 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.507613 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.510995 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.528172 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.528309 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d14e3599-f754-4ce2-b846-90a7d7dfaea0-logs\") pod \"nova-metadata-0\" (UID: \"d14e3599-f754-4ce2-b846-90a7d7dfaea0\") " pod="openstack/nova-metadata-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.528367 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d14e3599-f754-4ce2-b846-90a7d7dfaea0-config-data\") pod \"nova-metadata-0\" (UID: \"d14e3599-f754-4ce2-b846-90a7d7dfaea0\") " pod="openstack/nova-metadata-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.528407 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d14e3599-f754-4ce2-b846-90a7d7dfaea0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d14e3599-f754-4ce2-b846-90a7d7dfaea0\") " pod="openstack/nova-metadata-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.528441 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6glsn\" (UniqueName: \"kubernetes.io/projected/d14e3599-f754-4ce2-b846-90a7d7dfaea0-kube-api-access-6glsn\") pod \"nova-metadata-0\" (UID: \"d14e3599-f754-4ce2-b846-90a7d7dfaea0\") " pod="openstack/nova-metadata-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.528626 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.570876 4734 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.607606 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-pg2kh"] Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.609686 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-pg2kh" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.688584 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47c53a21-cffa-4f1f-8379-0dc6d805bc99-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"47c53a21-cffa-4f1f-8379-0dc6d805bc99\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.689160 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th6lp\" (UniqueName: \"kubernetes.io/projected/47c53a21-cffa-4f1f-8379-0dc6d805bc99-kube-api-access-th6lp\") pod \"nova-cell1-novncproxy-0\" (UID: \"47c53a21-cffa-4f1f-8379-0dc6d805bc99\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.689237 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d14e3599-f754-4ce2-b846-90a7d7dfaea0-logs\") pod \"nova-metadata-0\" (UID: \"d14e3599-f754-4ce2-b846-90a7d7dfaea0\") " pod="openstack/nova-metadata-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.689279 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d14e3599-f754-4ce2-b846-90a7d7dfaea0-config-data\") pod \"nova-metadata-0\" (UID: \"d14e3599-f754-4ce2-b846-90a7d7dfaea0\") " pod="openstack/nova-metadata-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.689343 4734 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d14e3599-f754-4ce2-b846-90a7d7dfaea0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d14e3599-f754-4ce2-b846-90a7d7dfaea0\") " pod="openstack/nova-metadata-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.689374 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6glsn\" (UniqueName: \"kubernetes.io/projected/d14e3599-f754-4ce2-b846-90a7d7dfaea0-kube-api-access-6glsn\") pod \"nova-metadata-0\" (UID: \"d14e3599-f754-4ce2-b846-90a7d7dfaea0\") " pod="openstack/nova-metadata-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.689555 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47c53a21-cffa-4f1f-8379-0dc6d805bc99-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"47c53a21-cffa-4f1f-8379-0dc6d805bc99\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.723136 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d14e3599-f754-4ce2-b846-90a7d7dfaea0-logs\") pod \"nova-metadata-0\" (UID: \"d14e3599-f754-4ce2-b846-90a7d7dfaea0\") " pod="openstack/nova-metadata-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.803646 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6glsn\" (UniqueName: \"kubernetes.io/projected/d14e3599-f754-4ce2-b846-90a7d7dfaea0-kube-api-access-6glsn\") pod \"nova-metadata-0\" (UID: \"d14e3599-f754-4ce2-b846-90a7d7dfaea0\") " pod="openstack/nova-metadata-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.808006 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d14e3599-f754-4ce2-b846-90a7d7dfaea0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d14e3599-f754-4ce2-b846-90a7d7dfaea0\") " pod="openstack/nova-metadata-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.809787 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-pg2kh"] Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.813562 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d14e3599-f754-4ce2-b846-90a7d7dfaea0-config-data\") pod \"nova-metadata-0\" (UID: \"d14e3599-f754-4ce2-b846-90a7d7dfaea0\") " pod="openstack/nova-metadata-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.830718 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.840115 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5484b60-2312-4287-9d50-4c15f83f9253-dns-svc\") pod \"dnsmasq-dns-757b4f8459-pg2kh\" (UID: \"e5484b60-2312-4287-9d50-4c15f83f9253\") " pod="openstack/dnsmasq-dns-757b4f8459-pg2kh" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.840225 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5484b60-2312-4287-9d50-4c15f83f9253-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-pg2kh\" (UID: \"e5484b60-2312-4287-9d50-4c15f83f9253\") " pod="openstack/dnsmasq-dns-757b4f8459-pg2kh" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.840260 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47c53a21-cffa-4f1f-8379-0dc6d805bc99-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"47c53a21-cffa-4f1f-8379-0dc6d805bc99\") 
" pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.840286 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57n7l\" (UniqueName: \"kubernetes.io/projected/e5484b60-2312-4287-9d50-4c15f83f9253-kube-api-access-57n7l\") pod \"dnsmasq-dns-757b4f8459-pg2kh\" (UID: \"e5484b60-2312-4287-9d50-4c15f83f9253\") " pod="openstack/dnsmasq-dns-757b4f8459-pg2kh" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.840311 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5484b60-2312-4287-9d50-4c15f83f9253-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-pg2kh\" (UID: \"e5484b60-2312-4287-9d50-4c15f83f9253\") " pod="openstack/dnsmasq-dns-757b4f8459-pg2kh" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.840332 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th6lp\" (UniqueName: \"kubernetes.io/projected/47c53a21-cffa-4f1f-8379-0dc6d805bc99-kube-api-access-th6lp\") pod \"nova-cell1-novncproxy-0\" (UID: \"47c53a21-cffa-4f1f-8379-0dc6d805bc99\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.840385 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5484b60-2312-4287-9d50-4c15f83f9253-config\") pod \"dnsmasq-dns-757b4f8459-pg2kh\" (UID: \"e5484b60-2312-4287-9d50-4c15f83f9253\") " pod="openstack/dnsmasq-dns-757b4f8459-pg2kh" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.840488 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5484b60-2312-4287-9d50-4c15f83f9253-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-pg2kh\" (UID: 
\"e5484b60-2312-4287-9d50-4c15f83f9253\") " pod="openstack/dnsmasq-dns-757b4f8459-pg2kh" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.840566 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47c53a21-cffa-4f1f-8379-0dc6d805bc99-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"47c53a21-cffa-4f1f-8379-0dc6d805bc99\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.851378 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47c53a21-cffa-4f1f-8379-0dc6d805bc99-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"47c53a21-cffa-4f1f-8379-0dc6d805bc99\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.855637 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47c53a21-cffa-4f1f-8379-0dc6d805bc99-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"47c53a21-cffa-4f1f-8379-0dc6d805bc99\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.859490 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.867593 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.874870 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.875081 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.887391 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th6lp\" (UniqueName: \"kubernetes.io/projected/47c53a21-cffa-4f1f-8379-0dc6d805bc99-kube-api-access-th6lp\") pod \"nova-cell1-novncproxy-0\" (UID: \"47c53a21-cffa-4f1f-8379-0dc6d805bc99\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.945435 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57n7l\" (UniqueName: \"kubernetes.io/projected/e5484b60-2312-4287-9d50-4c15f83f9253-kube-api-access-57n7l\") pod \"dnsmasq-dns-757b4f8459-pg2kh\" (UID: \"e5484b60-2312-4287-9d50-4c15f83f9253\") " pod="openstack/dnsmasq-dns-757b4f8459-pg2kh" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.945508 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5484b60-2312-4287-9d50-4c15f83f9253-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-pg2kh\" (UID: \"e5484b60-2312-4287-9d50-4c15f83f9253\") " pod="openstack/dnsmasq-dns-757b4f8459-pg2kh" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.945581 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5484b60-2312-4287-9d50-4c15f83f9253-config\") pod \"dnsmasq-dns-757b4f8459-pg2kh\" (UID: \"e5484b60-2312-4287-9d50-4c15f83f9253\") " pod="openstack/dnsmasq-dns-757b4f8459-pg2kh" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.945629 4734 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5484b60-2312-4287-9d50-4c15f83f9253-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-pg2kh\" (UID: \"e5484b60-2312-4287-9d50-4c15f83f9253\") " pod="openstack/dnsmasq-dns-757b4f8459-pg2kh" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.945706 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5484b60-2312-4287-9d50-4c15f83f9253-dns-svc\") pod \"dnsmasq-dns-757b4f8459-pg2kh\" (UID: \"e5484b60-2312-4287-9d50-4c15f83f9253\") " pod="openstack/dnsmasq-dns-757b4f8459-pg2kh" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.945757 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5484b60-2312-4287-9d50-4c15f83f9253-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-pg2kh\" (UID: \"e5484b60-2312-4287-9d50-4c15f83f9253\") " pod="openstack/dnsmasq-dns-757b4f8459-pg2kh" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.947222 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5484b60-2312-4287-9d50-4c15f83f9253-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-pg2kh\" (UID: \"e5484b60-2312-4287-9d50-4c15f83f9253\") " pod="openstack/dnsmasq-dns-757b4f8459-pg2kh" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.947460 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5484b60-2312-4287-9d50-4c15f83f9253-config\") pod \"dnsmasq-dns-757b4f8459-pg2kh\" (UID: \"e5484b60-2312-4287-9d50-4c15f83f9253\") " pod="openstack/dnsmasq-dns-757b4f8459-pg2kh" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.960549 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5484b60-2312-4287-9d50-4c15f83f9253-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-pg2kh\" (UID: \"e5484b60-2312-4287-9d50-4c15f83f9253\") " pod="openstack/dnsmasq-dns-757b4f8459-pg2kh" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.963946 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5484b60-2312-4287-9d50-4c15f83f9253-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-pg2kh\" (UID: \"e5484b60-2312-4287-9d50-4c15f83f9253\") " pod="openstack/dnsmasq-dns-757b4f8459-pg2kh" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.963988 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5484b60-2312-4287-9d50-4c15f83f9253-dns-svc\") pod \"dnsmasq-dns-757b4f8459-pg2kh\" (UID: \"e5484b60-2312-4287-9d50-4c15f83f9253\") " pod="openstack/dnsmasq-dns-757b4f8459-pg2kh" Dec 05 23:40:59 crc kubenswrapper[4734]: I1205 23:40:59.977457 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57n7l\" (UniqueName: \"kubernetes.io/projected/e5484b60-2312-4287-9d50-4c15f83f9253-kube-api-access-57n7l\") pod \"dnsmasq-dns-757b4f8459-pg2kh\" (UID: \"e5484b60-2312-4287-9d50-4c15f83f9253\") " pod="openstack/dnsmasq-dns-757b4f8459-pg2kh" Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.052765 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl9qc\" (UniqueName: \"kubernetes.io/projected/5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe-kube-api-access-rl9qc\") pod \"nova-scheduler-0\" (UID: \"5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe\") " pod="openstack/nova-scheduler-0" Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.052834 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe-config-data\") pod \"nova-scheduler-0\" (UID: \"5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe\") " pod="openstack/nova-scheduler-0" Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.052889 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe\") " pod="openstack/nova-scheduler-0" Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.111122 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-pg2kh" Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.155068 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl9qc\" (UniqueName: \"kubernetes.io/projected/5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe-kube-api-access-rl9qc\") pod \"nova-scheduler-0\" (UID: \"5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe\") " pod="openstack/nova-scheduler-0" Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.155487 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe-config-data\") pod \"nova-scheduler-0\" (UID: \"5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe\") " pod="openstack/nova-scheduler-0" Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.155599 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe\") " pod="openstack/nova-scheduler-0" Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.169361 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe-config-data\") pod \"nova-scheduler-0\" (UID: \"5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe\") " pod="openstack/nova-scheduler-0" Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.169991 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe\") " pod="openstack/nova-scheduler-0" Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.176191 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.184227 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl9qc\" (UniqueName: \"kubernetes.io/projected/5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe-kube-api-access-rl9qc\") pod \"nova-scheduler-0\" (UID: \"5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe\") " pod="openstack/nova-scheduler-0" Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.210410 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.309438 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mbpqk"] Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.383868 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.616054 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tr7x7"] Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.636924 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tr7x7"] Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.637163 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tr7x7" Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.643045 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.643290 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.700257 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.778349 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3111319c-91ad-46ab-847b-5f08b2d01cb5-scripts\") pod \"nova-cell1-conductor-db-sync-tr7x7\" (UID: \"3111319c-91ad-46ab-847b-5f08b2d01cb5\") " pod="openstack/nova-cell1-conductor-db-sync-tr7x7" Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.778732 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3111319c-91ad-46ab-847b-5f08b2d01cb5-config-data\") pod \"nova-cell1-conductor-db-sync-tr7x7\" (UID: \"3111319c-91ad-46ab-847b-5f08b2d01cb5\") " pod="openstack/nova-cell1-conductor-db-sync-tr7x7" Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.778820 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrpls\" (UniqueName: \"kubernetes.io/projected/3111319c-91ad-46ab-847b-5f08b2d01cb5-kube-api-access-zrpls\") pod \"nova-cell1-conductor-db-sync-tr7x7\" (UID: \"3111319c-91ad-46ab-847b-5f08b2d01cb5\") " pod="openstack/nova-cell1-conductor-db-sync-tr7x7" Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.779195 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3111319c-91ad-46ab-847b-5f08b2d01cb5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-tr7x7\" (UID: \"3111319c-91ad-46ab-847b-5f08b2d01cb5\") " pod="openstack/nova-cell1-conductor-db-sync-tr7x7" Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.888228 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3111319c-91ad-46ab-847b-5f08b2d01cb5-scripts\") pod \"nova-cell1-conductor-db-sync-tr7x7\" (UID: \"3111319c-91ad-46ab-847b-5f08b2d01cb5\") " pod="openstack/nova-cell1-conductor-db-sync-tr7x7" Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.889012 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3111319c-91ad-46ab-847b-5f08b2d01cb5-config-data\") pod \"nova-cell1-conductor-db-sync-tr7x7\" (UID: \"3111319c-91ad-46ab-847b-5f08b2d01cb5\") " pod="openstack/nova-cell1-conductor-db-sync-tr7x7" Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.889076 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zrpls\" (UniqueName: \"kubernetes.io/projected/3111319c-91ad-46ab-847b-5f08b2d01cb5-kube-api-access-zrpls\") pod \"nova-cell1-conductor-db-sync-tr7x7\" (UID: \"3111319c-91ad-46ab-847b-5f08b2d01cb5\") " pod="openstack/nova-cell1-conductor-db-sync-tr7x7" Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.889234 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3111319c-91ad-46ab-847b-5f08b2d01cb5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-tr7x7\" (UID: \"3111319c-91ad-46ab-847b-5f08b2d01cb5\") " pod="openstack/nova-cell1-conductor-db-sync-tr7x7" Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.908302 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3111319c-91ad-46ab-847b-5f08b2d01cb5-scripts\") pod \"nova-cell1-conductor-db-sync-tr7x7\" (UID: \"3111319c-91ad-46ab-847b-5f08b2d01cb5\") " pod="openstack/nova-cell1-conductor-db-sync-tr7x7" Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.909083 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3111319c-91ad-46ab-847b-5f08b2d01cb5-config-data\") pod \"nova-cell1-conductor-db-sync-tr7x7\" (UID: \"3111319c-91ad-46ab-847b-5f08b2d01cb5\") " pod="openstack/nova-cell1-conductor-db-sync-tr7x7" Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.910486 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3111319c-91ad-46ab-847b-5f08b2d01cb5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-tr7x7\" (UID: \"3111319c-91ad-46ab-847b-5f08b2d01cb5\") " pod="openstack/nova-cell1-conductor-db-sync-tr7x7" Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.912839 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrpls\" 
(UniqueName: \"kubernetes.io/projected/3111319c-91ad-46ab-847b-5f08b2d01cb5-kube-api-access-zrpls\") pod \"nova-cell1-conductor-db-sync-tr7x7\" (UID: \"3111319c-91ad-46ab-847b-5f08b2d01cb5\") " pod="openstack/nova-cell1-conductor-db-sync-tr7x7" Dec 05 23:41:00 crc kubenswrapper[4734]: I1205 23:41:00.913292 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 23:41:00 crc kubenswrapper[4734]: W1205 23:41:00.932569 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47c53a21_cffa_4f1f_8379_0dc6d805bc99.slice/crio-68383d03cf73c839d17054de97d5bcafe601670ac62f95cc9815b939b30a0bb4 WatchSource:0}: Error finding container 68383d03cf73c839d17054de97d5bcafe601670ac62f95cc9815b939b30a0bb4: Status 404 returned error can't find the container with id 68383d03cf73c839d17054de97d5bcafe601670ac62f95cc9815b939b30a0bb4 Dec 05 23:41:01 crc kubenswrapper[4734]: I1205 23:41:00.997656 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tr7x7" Dec 05 23:41:01 crc kubenswrapper[4734]: I1205 23:41:00.999669 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-pg2kh"] Dec 05 23:41:01 crc kubenswrapper[4734]: I1205 23:41:01.134282 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d14e3599-f754-4ce2-b846-90a7d7dfaea0","Type":"ContainerStarted","Data":"aa253fb64b6068590b2d203e003b3d8f7459be7ec97188eab06c11ea67d000df"} Dec 05 23:41:01 crc kubenswrapper[4734]: I1205 23:41:01.141537 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 23:41:01 crc kubenswrapper[4734]: I1205 23:41:01.144223 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mbpqk" event={"ID":"a44e1e9c-243f-4967-ac93-72db0dd02eb0","Type":"ContainerStarted","Data":"c44f55b4f5d13633a352b4bc1c0a5397ffd26bf622ddbfc60fdb7b3ad6830200"} Dec 05 23:41:01 crc kubenswrapper[4734]: I1205 23:41:01.144275 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mbpqk" event={"ID":"a44e1e9c-243f-4967-ac93-72db0dd02eb0","Type":"ContainerStarted","Data":"4e516d312e73fa7f57e85b654c51a85682305d59e92cded350a0ba7779279435"} Dec 05 23:41:01 crc kubenswrapper[4734]: I1205 23:41:01.148658 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f5a50fc-e01d-43e6-9c99-9e1693246981","Type":"ContainerStarted","Data":"f91c715a157541b9f76c4249f10b528d1d65a00bea5b65e56d8407413e40cef8"} Dec 05 23:41:01 crc kubenswrapper[4734]: I1205 23:41:01.160911 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-pg2kh" event={"ID":"e5484b60-2312-4287-9d50-4c15f83f9253","Type":"ContainerStarted","Data":"68dde1277a98f29470bf9f7a0af5edf3d0660648981abb8e51eb5b1768df899e"} Dec 05 23:41:01 crc kubenswrapper[4734]: I1205 
23:41:01.163576 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"47c53a21-cffa-4f1f-8379-0dc6d805bc99","Type":"ContainerStarted","Data":"68383d03cf73c839d17054de97d5bcafe601670ac62f95cc9815b939b30a0bb4"} Dec 05 23:41:01 crc kubenswrapper[4734]: I1205 23:41:01.184292 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-mbpqk" podStartSLOduration=3.184261689 podStartE2EDuration="3.184261689s" podCreationTimestamp="2025-12-05 23:40:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:41:01.174868111 +0000 UTC m=+1281.858272387" watchObservedRunningTime="2025-12-05 23:41:01.184261689 +0000 UTC m=+1281.867665975" Dec 05 23:41:01 crc kubenswrapper[4734]: I1205 23:41:01.595698 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tr7x7"] Dec 05 23:41:01 crc kubenswrapper[4734]: W1205 23:41:01.620547 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3111319c_91ad_46ab_847b_5f08b2d01cb5.slice/crio-13f7d683f27ad0f3231ee54b5018b20e79d32cc15ee74d4fb96b16a46255ab33 WatchSource:0}: Error finding container 13f7d683f27ad0f3231ee54b5018b20e79d32cc15ee74d4fb96b16a46255ab33: Status 404 returned error can't find the container with id 13f7d683f27ad0f3231ee54b5018b20e79d32cc15ee74d4fb96b16a46255ab33 Dec 05 23:41:02 crc kubenswrapper[4734]: I1205 23:41:02.197155 4734 generic.go:334] "Generic (PLEG): container finished" podID="e5484b60-2312-4287-9d50-4c15f83f9253" containerID="95e07d1707d70b60667d14b890c0a9e0f051ca8c3a5b32a21d9052fc17f66394" exitCode=0 Dec 05 23:41:02 crc kubenswrapper[4734]: I1205 23:41:02.197669 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-pg2kh" 
event={"ID":"e5484b60-2312-4287-9d50-4c15f83f9253","Type":"ContainerDied","Data":"95e07d1707d70b60667d14b890c0a9e0f051ca8c3a5b32a21d9052fc17f66394"} Dec 05 23:41:02 crc kubenswrapper[4734]: I1205 23:41:02.209413 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe","Type":"ContainerStarted","Data":"92589d2be56ae07d94afc8fe0a52d95a89da5f7e48a0a09df95255162e38fc29"} Dec 05 23:41:02 crc kubenswrapper[4734]: I1205 23:41:02.235967 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tr7x7" event={"ID":"3111319c-91ad-46ab-847b-5f08b2d01cb5","Type":"ContainerStarted","Data":"174ec252ff03fe6b25dab570d0b619bb332013779daf1f7f9acf87fea2e9e9d5"} Dec 05 23:41:02 crc kubenswrapper[4734]: I1205 23:41:02.236025 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tr7x7" event={"ID":"3111319c-91ad-46ab-847b-5f08b2d01cb5","Type":"ContainerStarted","Data":"13f7d683f27ad0f3231ee54b5018b20e79d32cc15ee74d4fb96b16a46255ab33"} Dec 05 23:41:02 crc kubenswrapper[4734]: I1205 23:41:02.289722 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-tr7x7" podStartSLOduration=2.289678863 podStartE2EDuration="2.289678863s" podCreationTimestamp="2025-12-05 23:41:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:41:02.275301655 +0000 UTC m=+1282.958705931" watchObservedRunningTime="2025-12-05 23:41:02.289678863 +0000 UTC m=+1282.973083139" Dec 05 23:41:03 crc kubenswrapper[4734]: I1205 23:41:03.258059 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-pg2kh" event={"ID":"e5484b60-2312-4287-9d50-4c15f83f9253","Type":"ContainerStarted","Data":"ece29f0471793b187008d55962f5263fc66b1c51ef70bbe010b455f83f4f8f38"} Dec 05 
23:41:03 crc kubenswrapper[4734]: I1205 23:41:03.258805 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-pg2kh" Dec 05 23:41:03 crc kubenswrapper[4734]: I1205 23:41:03.290224 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-pg2kh" podStartSLOduration=4.290190963 podStartE2EDuration="4.290190963s" podCreationTimestamp="2025-12-05 23:40:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:41:03.283184003 +0000 UTC m=+1283.966588279" watchObservedRunningTime="2025-12-05 23:41:03.290190963 +0000 UTC m=+1283.973595239" Dec 05 23:41:03 crc kubenswrapper[4734]: I1205 23:41:03.587727 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 23:41:03 crc kubenswrapper[4734]: I1205 23:41:03.653061 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 23:41:06 crc kubenswrapper[4734]: I1205 23:41:06.293046 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d14e3599-f754-4ce2-b846-90a7d7dfaea0" containerName="nova-metadata-log" containerID="cri-o://7e0841fde4c8c4fd660cca61da33ab5ba8a5d312bdc57ce500044a4b201366be" gracePeriod=30 Dec 05 23:41:06 crc kubenswrapper[4734]: I1205 23:41:06.293617 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d14e3599-f754-4ce2-b846-90a7d7dfaea0","Type":"ContainerStarted","Data":"148fae7be0bb6569ce5f4f043b67c28e8608610ee993a2dd7181bce1844bf8b8"} Dec 05 23:41:06 crc kubenswrapper[4734]: I1205 23:41:06.296878 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"d14e3599-f754-4ce2-b846-90a7d7dfaea0","Type":"ContainerStarted","Data":"7e0841fde4c8c4fd660cca61da33ab5ba8a5d312bdc57ce500044a4b201366be"} Dec 05 23:41:06 crc kubenswrapper[4734]: I1205 23:41:06.296922 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f5a50fc-e01d-43e6-9c99-9e1693246981","Type":"ContainerStarted","Data":"677e986abeb971a76a1013f9820694efc92542a64a8325609213baa440d4efc4"} Dec 05 23:41:06 crc kubenswrapper[4734]: I1205 23:41:06.296943 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f5a50fc-e01d-43e6-9c99-9e1693246981","Type":"ContainerStarted","Data":"411d063df803029c8239279e56e7d050cc4e5c75b74d6e0b5457fb3e42051d9b"} Dec 05 23:41:06 crc kubenswrapper[4734]: I1205 23:41:06.294084 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d14e3599-f754-4ce2-b846-90a7d7dfaea0" containerName="nova-metadata-metadata" containerID="cri-o://148fae7be0bb6569ce5f4f043b67c28e8608610ee993a2dd7181bce1844bf8b8" gracePeriod=30 Dec 05 23:41:06 crc kubenswrapper[4734]: I1205 23:41:06.299247 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"47c53a21-cffa-4f1f-8379-0dc6d805bc99","Type":"ContainerStarted","Data":"25cd2e727ef56bc8b06e59a125ebf10af81582d3e0e479cb7a45294050547b08"} Dec 05 23:41:06 crc kubenswrapper[4734]: I1205 23:41:06.299371 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="47c53a21-cffa-4f1f-8379-0dc6d805bc99" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://25cd2e727ef56bc8b06e59a125ebf10af81582d3e0e479cb7a45294050547b08" gracePeriod=30 Dec 05 23:41:06 crc kubenswrapper[4734]: I1205 23:41:06.303248 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe","Type":"ContainerStarted","Data":"45c51140dc9b89c59442afd9e329a1d5eff3e47bd65c836a65343ec08bc269b0"} Dec 05 23:41:06 crc kubenswrapper[4734]: I1205 23:41:06.353737 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.174520099 podStartE2EDuration="7.353707635s" podCreationTimestamp="2025-12-05 23:40:59 +0000 UTC" firstStartedPulling="2025-12-05 23:41:00.70010735 +0000 UTC m=+1281.383511626" lastFinishedPulling="2025-12-05 23:41:04.879294886 +0000 UTC m=+1285.562699162" observedRunningTime="2025-12-05 23:41:06.329289202 +0000 UTC m=+1287.012693498" watchObservedRunningTime="2025-12-05 23:41:06.353707635 +0000 UTC m=+1287.037111931" Dec 05 23:41:06 crc kubenswrapper[4734]: I1205 23:41:06.359882 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.960650579 podStartE2EDuration="7.359858004s" podCreationTimestamp="2025-12-05 23:40:59 +0000 UTC" firstStartedPulling="2025-12-05 23:41:00.485128872 +0000 UTC m=+1281.168533148" lastFinishedPulling="2025-12-05 23:41:04.884336297 +0000 UTC m=+1285.567740573" observedRunningTime="2025-12-05 23:41:06.346269214 +0000 UTC m=+1287.029673510" watchObservedRunningTime="2025-12-05 23:41:06.359858004 +0000 UTC m=+1287.043262300" Dec 05 23:41:06 crc kubenswrapper[4734]: I1205 23:41:06.380633 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.448808976 podStartE2EDuration="7.380611458s" podCreationTimestamp="2025-12-05 23:40:59 +0000 UTC" firstStartedPulling="2025-12-05 23:41:00.947123124 +0000 UTC m=+1281.630527400" lastFinishedPulling="2025-12-05 23:41:04.878925606 +0000 UTC m=+1285.562329882" observedRunningTime="2025-12-05 23:41:06.378041105 +0000 UTC m=+1287.061445401" watchObservedRunningTime="2025-12-05 23:41:06.380611458 +0000 UTC m=+1287.064015734" Dec 05 
23:41:06 crc kubenswrapper[4734]: I1205 23:41:06.415412 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.746506179 podStartE2EDuration="7.415378021s" podCreationTimestamp="2025-12-05 23:40:59 +0000 UTC" firstStartedPulling="2025-12-05 23:41:01.208441145 +0000 UTC m=+1281.891845421" lastFinishedPulling="2025-12-05 23:41:04.877312987 +0000 UTC m=+1285.560717263" observedRunningTime="2025-12-05 23:41:06.401928105 +0000 UTC m=+1287.085332381" watchObservedRunningTime="2025-12-05 23:41:06.415378021 +0000 UTC m=+1287.098782297" Dec 05 23:41:06 crc kubenswrapper[4734]: I1205 23:41:06.981027 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.073463 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d14e3599-f754-4ce2-b846-90a7d7dfaea0-logs\") pod \"d14e3599-f754-4ce2-b846-90a7d7dfaea0\" (UID: \"d14e3599-f754-4ce2-b846-90a7d7dfaea0\") " Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.073687 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6glsn\" (UniqueName: \"kubernetes.io/projected/d14e3599-f754-4ce2-b846-90a7d7dfaea0-kube-api-access-6glsn\") pod \"d14e3599-f754-4ce2-b846-90a7d7dfaea0\" (UID: \"d14e3599-f754-4ce2-b846-90a7d7dfaea0\") " Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.073767 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d14e3599-f754-4ce2-b846-90a7d7dfaea0-combined-ca-bundle\") pod \"d14e3599-f754-4ce2-b846-90a7d7dfaea0\" (UID: \"d14e3599-f754-4ce2-b846-90a7d7dfaea0\") " Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.073880 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/d14e3599-f754-4ce2-b846-90a7d7dfaea0-config-data\") pod \"d14e3599-f754-4ce2-b846-90a7d7dfaea0\" (UID: \"d14e3599-f754-4ce2-b846-90a7d7dfaea0\") " Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.074667 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d14e3599-f754-4ce2-b846-90a7d7dfaea0-logs" (OuterVolumeSpecName: "logs") pod "d14e3599-f754-4ce2-b846-90a7d7dfaea0" (UID: "d14e3599-f754-4ce2-b846-90a7d7dfaea0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.087386 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d14e3599-f754-4ce2-b846-90a7d7dfaea0-kube-api-access-6glsn" (OuterVolumeSpecName: "kube-api-access-6glsn") pod "d14e3599-f754-4ce2-b846-90a7d7dfaea0" (UID: "d14e3599-f754-4ce2-b846-90a7d7dfaea0"). InnerVolumeSpecName "kube-api-access-6glsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.109800 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d14e3599-f754-4ce2-b846-90a7d7dfaea0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d14e3599-f754-4ce2-b846-90a7d7dfaea0" (UID: "d14e3599-f754-4ce2-b846-90a7d7dfaea0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.137480 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d14e3599-f754-4ce2-b846-90a7d7dfaea0-config-data" (OuterVolumeSpecName: "config-data") pod "d14e3599-f754-4ce2-b846-90a7d7dfaea0" (UID: "d14e3599-f754-4ce2-b846-90a7d7dfaea0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.176899 4734 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d14e3599-f754-4ce2-b846-90a7d7dfaea0-logs\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.176953 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6glsn\" (UniqueName: \"kubernetes.io/projected/d14e3599-f754-4ce2-b846-90a7d7dfaea0-kube-api-access-6glsn\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.176973 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d14e3599-f754-4ce2-b846-90a7d7dfaea0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.176985 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d14e3599-f754-4ce2-b846-90a7d7dfaea0-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.314486 4734 generic.go:334] "Generic (PLEG): container finished" podID="d14e3599-f754-4ce2-b846-90a7d7dfaea0" containerID="148fae7be0bb6569ce5f4f043b67c28e8608610ee993a2dd7181bce1844bf8b8" exitCode=0 Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.314535 4734 generic.go:334] "Generic (PLEG): container finished" podID="d14e3599-f754-4ce2-b846-90a7d7dfaea0" containerID="7e0841fde4c8c4fd660cca61da33ab5ba8a5d312bdc57ce500044a4b201366be" exitCode=143 Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.314584 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d14e3599-f754-4ce2-b846-90a7d7dfaea0","Type":"ContainerDied","Data":"148fae7be0bb6569ce5f4f043b67c28e8608610ee993a2dd7181bce1844bf8b8"} Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.314636 4734 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d14e3599-f754-4ce2-b846-90a7d7dfaea0","Type":"ContainerDied","Data":"7e0841fde4c8c4fd660cca61da33ab5ba8a5d312bdc57ce500044a4b201366be"} Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.314647 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d14e3599-f754-4ce2-b846-90a7d7dfaea0","Type":"ContainerDied","Data":"aa253fb64b6068590b2d203e003b3d8f7459be7ec97188eab06c11ea67d000df"} Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.314667 4734 scope.go:117] "RemoveContainer" containerID="148fae7be0bb6569ce5f4f043b67c28e8608610ee993a2dd7181bce1844bf8b8" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.316180 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.349888 4734 scope.go:117] "RemoveContainer" containerID="7e0841fde4c8c4fd660cca61da33ab5ba8a5d312bdc57ce500044a4b201366be" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.370774 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.387911 4734 scope.go:117] "RemoveContainer" containerID="148fae7be0bb6569ce5f4f043b67c28e8608610ee993a2dd7181bce1844bf8b8" Dec 05 23:41:07 crc kubenswrapper[4734]: E1205 23:41:07.388551 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"148fae7be0bb6569ce5f4f043b67c28e8608610ee993a2dd7181bce1844bf8b8\": container with ID starting with 148fae7be0bb6569ce5f4f043b67c28e8608610ee993a2dd7181bce1844bf8b8 not found: ID does not exist" containerID="148fae7be0bb6569ce5f4f043b67c28e8608610ee993a2dd7181bce1844bf8b8" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.388612 4734 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"148fae7be0bb6569ce5f4f043b67c28e8608610ee993a2dd7181bce1844bf8b8"} err="failed to get container status \"148fae7be0bb6569ce5f4f043b67c28e8608610ee993a2dd7181bce1844bf8b8\": rpc error: code = NotFound desc = could not find container \"148fae7be0bb6569ce5f4f043b67c28e8608610ee993a2dd7181bce1844bf8b8\": container with ID starting with 148fae7be0bb6569ce5f4f043b67c28e8608610ee993a2dd7181bce1844bf8b8 not found: ID does not exist" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.388645 4734 scope.go:117] "RemoveContainer" containerID="7e0841fde4c8c4fd660cca61da33ab5ba8a5d312bdc57ce500044a4b201366be" Dec 05 23:41:07 crc kubenswrapper[4734]: E1205 23:41:07.388950 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e0841fde4c8c4fd660cca61da33ab5ba8a5d312bdc57ce500044a4b201366be\": container with ID starting with 7e0841fde4c8c4fd660cca61da33ab5ba8a5d312bdc57ce500044a4b201366be not found: ID does not exist" containerID="7e0841fde4c8c4fd660cca61da33ab5ba8a5d312bdc57ce500044a4b201366be" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.388983 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e0841fde4c8c4fd660cca61da33ab5ba8a5d312bdc57ce500044a4b201366be"} err="failed to get container status \"7e0841fde4c8c4fd660cca61da33ab5ba8a5d312bdc57ce500044a4b201366be\": rpc error: code = NotFound desc = could not find container \"7e0841fde4c8c4fd660cca61da33ab5ba8a5d312bdc57ce500044a4b201366be\": container with ID starting with 7e0841fde4c8c4fd660cca61da33ab5ba8a5d312bdc57ce500044a4b201366be not found: ID does not exist" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.389006 4734 scope.go:117] "RemoveContainer" containerID="148fae7be0bb6569ce5f4f043b67c28e8608610ee993a2dd7181bce1844bf8b8" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.389361 4734 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"148fae7be0bb6569ce5f4f043b67c28e8608610ee993a2dd7181bce1844bf8b8"} err="failed to get container status \"148fae7be0bb6569ce5f4f043b67c28e8608610ee993a2dd7181bce1844bf8b8\": rpc error: code = NotFound desc = could not find container \"148fae7be0bb6569ce5f4f043b67c28e8608610ee993a2dd7181bce1844bf8b8\": container with ID starting with 148fae7be0bb6569ce5f4f043b67c28e8608610ee993a2dd7181bce1844bf8b8 not found: ID does not exist" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.389391 4734 scope.go:117] "RemoveContainer" containerID="7e0841fde4c8c4fd660cca61da33ab5ba8a5d312bdc57ce500044a4b201366be" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.390604 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e0841fde4c8c4fd660cca61da33ab5ba8a5d312bdc57ce500044a4b201366be"} err="failed to get container status \"7e0841fde4c8c4fd660cca61da33ab5ba8a5d312bdc57ce500044a4b201366be\": rpc error: code = NotFound desc = could not find container \"7e0841fde4c8c4fd660cca61da33ab5ba8a5d312bdc57ce500044a4b201366be\": container with ID starting with 7e0841fde4c8c4fd660cca61da33ab5ba8a5d312bdc57ce500044a4b201366be not found: ID does not exist" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.411755 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.423705 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 23:41:07 crc kubenswrapper[4734]: E1205 23:41:07.424389 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d14e3599-f754-4ce2-b846-90a7d7dfaea0" containerName="nova-metadata-log" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.424415 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="d14e3599-f754-4ce2-b846-90a7d7dfaea0" containerName="nova-metadata-log" Dec 05 23:41:07 crc kubenswrapper[4734]: E1205 
23:41:07.424452 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d14e3599-f754-4ce2-b846-90a7d7dfaea0" containerName="nova-metadata-metadata" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.424463 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="d14e3599-f754-4ce2-b846-90a7d7dfaea0" containerName="nova-metadata-metadata" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.424846 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="d14e3599-f754-4ce2-b846-90a7d7dfaea0" containerName="nova-metadata-log" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.424882 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="d14e3599-f754-4ce2-b846-90a7d7dfaea0" containerName="nova-metadata-metadata" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.426438 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.433300 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.441581 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.442831 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.585687 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nstl6\" (UniqueName: \"kubernetes.io/projected/4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf-kube-api-access-nstl6\") pod \"nova-metadata-0\" (UID: \"4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf\") " pod="openstack/nova-metadata-0" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.585881 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf\") " pod="openstack/nova-metadata-0" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.585920 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf-config-data\") pod \"nova-metadata-0\" (UID: \"4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf\") " pod="openstack/nova-metadata-0" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.585958 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf-logs\") pod \"nova-metadata-0\" (UID: \"4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf\") " pod="openstack/nova-metadata-0" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.586174 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf\") " pod="openstack/nova-metadata-0" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.634377 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d14e3599-f754-4ce2-b846-90a7d7dfaea0" path="/var/lib/kubelet/pods/d14e3599-f754-4ce2-b846-90a7d7dfaea0/volumes" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.689007 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf-logs\") pod \"nova-metadata-0\" (UID: \"4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf\") " pod="openstack/nova-metadata-0" Dec 05 23:41:07 crc kubenswrapper[4734]: 
I1205 23:41:07.689114 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf\") " pod="openstack/nova-metadata-0" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.689280 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nstl6\" (UniqueName: \"kubernetes.io/projected/4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf-kube-api-access-nstl6\") pod \"nova-metadata-0\" (UID: \"4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf\") " pod="openstack/nova-metadata-0" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.689428 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf\") " pod="openstack/nova-metadata-0" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.689481 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf-config-data\") pod \"nova-metadata-0\" (UID: \"4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf\") " pod="openstack/nova-metadata-0" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.689806 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf-logs\") pod \"nova-metadata-0\" (UID: \"4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf\") " pod="openstack/nova-metadata-0" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.693294 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf\") " pod="openstack/nova-metadata-0" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.693616 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf-config-data\") pod \"nova-metadata-0\" (UID: \"4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf\") " pod="openstack/nova-metadata-0" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.694021 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf\") " pod="openstack/nova-metadata-0" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.716929 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nstl6\" (UniqueName: \"kubernetes.io/projected/4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf-kube-api-access-nstl6\") pod \"nova-metadata-0\" (UID: \"4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf\") " pod="openstack/nova-metadata-0" Dec 05 23:41:07 crc kubenswrapper[4734]: I1205 23:41:07.749483 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 23:41:08 crc kubenswrapper[4734]: I1205 23:41:08.290209 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 23:41:08 crc kubenswrapper[4734]: I1205 23:41:08.336093 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf","Type":"ContainerStarted","Data":"433c43e9ec47b627bec1ee1cbf851804432d2939336aa7203b1294afb024411d"} Dec 05 23:41:09 crc kubenswrapper[4734]: I1205 23:41:09.349941 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf","Type":"ContainerStarted","Data":"bda03de33a6d7e1a1cc14f71a3904c85f82b7bfd082e36b16af6c8c4b24b4c9a"} Dec 05 23:41:09 crc kubenswrapper[4734]: I1205 23:41:09.350392 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf","Type":"ContainerStarted","Data":"300d4044b4b257f6ff4bfcafd20a5ce952c2e228635996d4885abf1864231afc"} Dec 05 23:41:09 crc kubenswrapper[4734]: I1205 23:41:09.381246 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.381214533 podStartE2EDuration="2.381214533s" podCreationTimestamp="2025-12-05 23:41:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:41:09.376251722 +0000 UTC m=+1290.059656008" watchObservedRunningTime="2025-12-05 23:41:09.381214533 +0000 UTC m=+1290.064618809" Dec 05 23:41:09 crc kubenswrapper[4734]: I1205 23:41:09.512056 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 23:41:09 crc kubenswrapper[4734]: I1205 23:41:09.512173 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-api-0" Dec 05 23:41:10 crc kubenswrapper[4734]: I1205 23:41:10.114197 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-pg2kh" Dec 05 23:41:10 crc kubenswrapper[4734]: I1205 23:41:10.176887 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:41:10 crc kubenswrapper[4734]: I1205 23:41:10.188663 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-d6vvd"] Dec 05 23:41:10 crc kubenswrapper[4734]: I1205 23:41:10.189342 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-d6vvd" podUID="3850ca0d-4d1c-4b14-8633-f313cbb09401" containerName="dnsmasq-dns" containerID="cri-o://ec05b4c35d592b979f2fc36dd2e8ef72fd5ba0b5bf2ea588bf3bb7c682b79057" gracePeriod=10 Dec 05 23:41:10 crc kubenswrapper[4734]: I1205 23:41:10.213071 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 05 23:41:10 crc kubenswrapper[4734]: I1205 23:41:10.213124 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 05 23:41:10 crc kubenswrapper[4734]: I1205 23:41:10.282800 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 05 23:41:10 crc kubenswrapper[4734]: I1205 23:41:10.395511 4734 generic.go:334] "Generic (PLEG): container finished" podID="3850ca0d-4d1c-4b14-8633-f313cbb09401" containerID="ec05b4c35d592b979f2fc36dd2e8ef72fd5ba0b5bf2ea588bf3bb7c682b79057" exitCode=0 Dec 05 23:41:10 crc kubenswrapper[4734]: I1205 23:41:10.395623 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-d6vvd" event={"ID":"3850ca0d-4d1c-4b14-8633-f313cbb09401","Type":"ContainerDied","Data":"ec05b4c35d592b979f2fc36dd2e8ef72fd5ba0b5bf2ea588bf3bb7c682b79057"} Dec 05 23:41:10 crc 
kubenswrapper[4734]: I1205 23:41:10.402278 4734 generic.go:334] "Generic (PLEG): container finished" podID="3111319c-91ad-46ab-847b-5f08b2d01cb5" containerID="174ec252ff03fe6b25dab570d0b619bb332013779daf1f7f9acf87fea2e9e9d5" exitCode=0 Dec 05 23:41:10 crc kubenswrapper[4734]: I1205 23:41:10.402598 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tr7x7" event={"ID":"3111319c-91ad-46ab-847b-5f08b2d01cb5","Type":"ContainerDied","Data":"174ec252ff03fe6b25dab570d0b619bb332013779daf1f7f9acf87fea2e9e9d5"} Dec 05 23:41:10 crc kubenswrapper[4734]: I1205 23:41:10.474807 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 05 23:41:10 crc kubenswrapper[4734]: I1205 23:41:10.596357 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4f5a50fc-e01d-43e6-9c99-9e1693246981" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.185:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 23:41:10 crc kubenswrapper[4734]: I1205 23:41:10.596753 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4f5a50fc-e01d-43e6-9c99-9e1693246981" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.185:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 23:41:10 crc kubenswrapper[4734]: I1205 23:41:10.867544 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-d6vvd" Dec 05 23:41:11 crc kubenswrapper[4734]: I1205 23:41:11.011390 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3850ca0d-4d1c-4b14-8633-f313cbb09401-dns-swift-storage-0\") pod \"3850ca0d-4d1c-4b14-8633-f313cbb09401\" (UID: \"3850ca0d-4d1c-4b14-8633-f313cbb09401\") " Dec 05 23:41:11 crc kubenswrapper[4734]: I1205 23:41:11.012557 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3850ca0d-4d1c-4b14-8633-f313cbb09401-config\") pod \"3850ca0d-4d1c-4b14-8633-f313cbb09401\" (UID: \"3850ca0d-4d1c-4b14-8633-f313cbb09401\") " Dec 05 23:41:11 crc kubenswrapper[4734]: I1205 23:41:11.012836 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rkm6\" (UniqueName: \"kubernetes.io/projected/3850ca0d-4d1c-4b14-8633-f313cbb09401-kube-api-access-2rkm6\") pod \"3850ca0d-4d1c-4b14-8633-f313cbb09401\" (UID: \"3850ca0d-4d1c-4b14-8633-f313cbb09401\") " Dec 05 23:41:11 crc kubenswrapper[4734]: I1205 23:41:11.013036 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3850ca0d-4d1c-4b14-8633-f313cbb09401-dns-svc\") pod \"3850ca0d-4d1c-4b14-8633-f313cbb09401\" (UID: \"3850ca0d-4d1c-4b14-8633-f313cbb09401\") " Dec 05 23:41:11 crc kubenswrapper[4734]: I1205 23:41:11.013870 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3850ca0d-4d1c-4b14-8633-f313cbb09401-ovsdbserver-nb\") pod \"3850ca0d-4d1c-4b14-8633-f313cbb09401\" (UID: \"3850ca0d-4d1c-4b14-8633-f313cbb09401\") " Dec 05 23:41:11 crc kubenswrapper[4734]: I1205 23:41:11.014101 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/3850ca0d-4d1c-4b14-8633-f313cbb09401-ovsdbserver-sb\") pod \"3850ca0d-4d1c-4b14-8633-f313cbb09401\" (UID: \"3850ca0d-4d1c-4b14-8633-f313cbb09401\") " Dec 05 23:41:11 crc kubenswrapper[4734]: I1205 23:41:11.031441 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3850ca0d-4d1c-4b14-8633-f313cbb09401-kube-api-access-2rkm6" (OuterVolumeSpecName: "kube-api-access-2rkm6") pod "3850ca0d-4d1c-4b14-8633-f313cbb09401" (UID: "3850ca0d-4d1c-4b14-8633-f313cbb09401"). InnerVolumeSpecName "kube-api-access-2rkm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:41:11 crc kubenswrapper[4734]: I1205 23:41:11.078582 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3850ca0d-4d1c-4b14-8633-f313cbb09401-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3850ca0d-4d1c-4b14-8633-f313cbb09401" (UID: "3850ca0d-4d1c-4b14-8633-f313cbb09401"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:41:11 crc kubenswrapper[4734]: I1205 23:41:11.084442 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3850ca0d-4d1c-4b14-8633-f313cbb09401-config" (OuterVolumeSpecName: "config") pod "3850ca0d-4d1c-4b14-8633-f313cbb09401" (UID: "3850ca0d-4d1c-4b14-8633-f313cbb09401"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:41:11 crc kubenswrapper[4734]: I1205 23:41:11.088614 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3850ca0d-4d1c-4b14-8633-f313cbb09401-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3850ca0d-4d1c-4b14-8633-f313cbb09401" (UID: "3850ca0d-4d1c-4b14-8633-f313cbb09401"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:41:11 crc kubenswrapper[4734]: I1205 23:41:11.099190 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3850ca0d-4d1c-4b14-8633-f313cbb09401-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3850ca0d-4d1c-4b14-8633-f313cbb09401" (UID: "3850ca0d-4d1c-4b14-8633-f313cbb09401"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:41:11 crc kubenswrapper[4734]: I1205 23:41:11.113805 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3850ca0d-4d1c-4b14-8633-f313cbb09401-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3850ca0d-4d1c-4b14-8633-f313cbb09401" (UID: "3850ca0d-4d1c-4b14-8633-f313cbb09401"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:41:11 crc kubenswrapper[4734]: I1205 23:41:11.117920 4734 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3850ca0d-4d1c-4b14-8633-f313cbb09401-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:11 crc kubenswrapper[4734]: I1205 23:41:11.117964 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3850ca0d-4d1c-4b14-8633-f313cbb09401-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:11 crc kubenswrapper[4734]: I1205 23:41:11.117975 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rkm6\" (UniqueName: \"kubernetes.io/projected/3850ca0d-4d1c-4b14-8633-f313cbb09401-kube-api-access-2rkm6\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:11 crc kubenswrapper[4734]: I1205 23:41:11.117985 4734 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3850ca0d-4d1c-4b14-8633-f313cbb09401-dns-svc\") on node 
\"crc\" DevicePath \"\"" Dec 05 23:41:11 crc kubenswrapper[4734]: I1205 23:41:11.117994 4734 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3850ca0d-4d1c-4b14-8633-f313cbb09401-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:11 crc kubenswrapper[4734]: I1205 23:41:11.118003 4734 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3850ca0d-4d1c-4b14-8633-f313cbb09401-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:11 crc kubenswrapper[4734]: I1205 23:41:11.423998 4734 generic.go:334] "Generic (PLEG): container finished" podID="a44e1e9c-243f-4967-ac93-72db0dd02eb0" containerID="c44f55b4f5d13633a352b4bc1c0a5397ffd26bf622ddbfc60fdb7b3ad6830200" exitCode=0 Dec 05 23:41:11 crc kubenswrapper[4734]: I1205 23:41:11.424100 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mbpqk" event={"ID":"a44e1e9c-243f-4967-ac93-72db0dd02eb0","Type":"ContainerDied","Data":"c44f55b4f5d13633a352b4bc1c0a5397ffd26bf622ddbfc60fdb7b3ad6830200"} Dec 05 23:41:11 crc kubenswrapper[4734]: I1205 23:41:11.427750 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-d6vvd" Dec 05 23:41:11 crc kubenswrapper[4734]: I1205 23:41:11.430475 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-d6vvd" event={"ID":"3850ca0d-4d1c-4b14-8633-f313cbb09401","Type":"ContainerDied","Data":"5da559fb92548f22f6f171f488931e398bddde9a520ff67ee5563b0827a61b98"} Dec 05 23:41:11 crc kubenswrapper[4734]: I1205 23:41:11.430698 4734 scope.go:117] "RemoveContainer" containerID="ec05b4c35d592b979f2fc36dd2e8ef72fd5ba0b5bf2ea588bf3bb7c682b79057" Dec 05 23:41:11 crc kubenswrapper[4734]: I1205 23:41:11.494712 4734 scope.go:117] "RemoveContainer" containerID="e63d7900d1c92f0bb1e3bd75a47703ac3221b5646756f298631cf385233503ec" Dec 05 23:41:11 crc kubenswrapper[4734]: I1205 23:41:11.614847 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-d6vvd"] Dec 05 23:41:11 crc kubenswrapper[4734]: I1205 23:41:11.717016 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-d6vvd"] Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.037659 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tr7x7" Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.150190 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3111319c-91ad-46ab-847b-5f08b2d01cb5-combined-ca-bundle\") pod \"3111319c-91ad-46ab-847b-5f08b2d01cb5\" (UID: \"3111319c-91ad-46ab-847b-5f08b2d01cb5\") " Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.151350 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3111319c-91ad-46ab-847b-5f08b2d01cb5-scripts\") pod \"3111319c-91ad-46ab-847b-5f08b2d01cb5\" (UID: \"3111319c-91ad-46ab-847b-5f08b2d01cb5\") " Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.151464 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3111319c-91ad-46ab-847b-5f08b2d01cb5-config-data\") pod \"3111319c-91ad-46ab-847b-5f08b2d01cb5\" (UID: \"3111319c-91ad-46ab-847b-5f08b2d01cb5\") " Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.151626 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrpls\" (UniqueName: \"kubernetes.io/projected/3111319c-91ad-46ab-847b-5f08b2d01cb5-kube-api-access-zrpls\") pod \"3111319c-91ad-46ab-847b-5f08b2d01cb5\" (UID: \"3111319c-91ad-46ab-847b-5f08b2d01cb5\") " Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.158395 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3111319c-91ad-46ab-847b-5f08b2d01cb5-kube-api-access-zrpls" (OuterVolumeSpecName: "kube-api-access-zrpls") pod "3111319c-91ad-46ab-847b-5f08b2d01cb5" (UID: "3111319c-91ad-46ab-847b-5f08b2d01cb5"). InnerVolumeSpecName "kube-api-access-zrpls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.158627 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3111319c-91ad-46ab-847b-5f08b2d01cb5-scripts" (OuterVolumeSpecName: "scripts") pod "3111319c-91ad-46ab-847b-5f08b2d01cb5" (UID: "3111319c-91ad-46ab-847b-5f08b2d01cb5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.190726 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3111319c-91ad-46ab-847b-5f08b2d01cb5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3111319c-91ad-46ab-847b-5f08b2d01cb5" (UID: "3111319c-91ad-46ab-847b-5f08b2d01cb5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.193208 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3111319c-91ad-46ab-847b-5f08b2d01cb5-config-data" (OuterVolumeSpecName: "config-data") pod "3111319c-91ad-46ab-847b-5f08b2d01cb5" (UID: "3111319c-91ad-46ab-847b-5f08b2d01cb5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.255009 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3111319c-91ad-46ab-847b-5f08b2d01cb5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.257482 4734 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3111319c-91ad-46ab-847b-5f08b2d01cb5-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.257724 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3111319c-91ad-46ab-847b-5f08b2d01cb5-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.257869 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrpls\" (UniqueName: \"kubernetes.io/projected/3111319c-91ad-46ab-847b-5f08b2d01cb5-kube-api-access-zrpls\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.448274 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tr7x7" Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.454053 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tr7x7" event={"ID":"3111319c-91ad-46ab-847b-5f08b2d01cb5","Type":"ContainerDied","Data":"13f7d683f27ad0f3231ee54b5018b20e79d32cc15ee74d4fb96b16a46255ab33"} Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.454141 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13f7d683f27ad0f3231ee54b5018b20e79d32cc15ee74d4fb96b16a46255ab33" Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.545321 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 23:41:12 crc kubenswrapper[4734]: E1205 23:41:12.546158 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3111319c-91ad-46ab-847b-5f08b2d01cb5" containerName="nova-cell1-conductor-db-sync" Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.546235 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="3111319c-91ad-46ab-847b-5f08b2d01cb5" containerName="nova-cell1-conductor-db-sync" Dec 05 23:41:12 crc kubenswrapper[4734]: E1205 23:41:12.546306 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3850ca0d-4d1c-4b14-8633-f313cbb09401" containerName="dnsmasq-dns" Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.546355 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="3850ca0d-4d1c-4b14-8633-f313cbb09401" containerName="dnsmasq-dns" Dec 05 23:41:12 crc kubenswrapper[4734]: E1205 23:41:12.546455 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3850ca0d-4d1c-4b14-8633-f313cbb09401" containerName="init" Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.546509 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="3850ca0d-4d1c-4b14-8633-f313cbb09401" containerName="init" Dec 05 23:41:12 crc kubenswrapper[4734]: 
I1205 23:41:12.546786 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="3850ca0d-4d1c-4b14-8633-f313cbb09401" containerName="dnsmasq-dns" Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.546851 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="3111319c-91ad-46ab-847b-5f08b2d01cb5" containerName="nova-cell1-conductor-db-sync" Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.547694 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.551050 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.564867 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.572429 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhvvx\" (UniqueName: \"kubernetes.io/projected/b801f420-78a0-4564-9339-fca1170a01d7-kube-api-access-vhvvx\") pod \"nova-cell1-conductor-0\" (UID: \"b801f420-78a0-4564-9339-fca1170a01d7\") " pod="openstack/nova-cell1-conductor-0" Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.572593 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b801f420-78a0-4564-9339-fca1170a01d7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b801f420-78a0-4564-9339-fca1170a01d7\") " pod="openstack/nova-cell1-conductor-0" Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.572659 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b801f420-78a0-4564-9339-fca1170a01d7-config-data\") pod \"nova-cell1-conductor-0\" 
(UID: \"b801f420-78a0-4564-9339-fca1170a01d7\") " pod="openstack/nova-cell1-conductor-0" Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.674661 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhvvx\" (UniqueName: \"kubernetes.io/projected/b801f420-78a0-4564-9339-fca1170a01d7-kube-api-access-vhvvx\") pod \"nova-cell1-conductor-0\" (UID: \"b801f420-78a0-4564-9339-fca1170a01d7\") " pod="openstack/nova-cell1-conductor-0" Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.675294 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b801f420-78a0-4564-9339-fca1170a01d7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b801f420-78a0-4564-9339-fca1170a01d7\") " pod="openstack/nova-cell1-conductor-0" Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.675454 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b801f420-78a0-4564-9339-fca1170a01d7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b801f420-78a0-4564-9339-fca1170a01d7\") " pod="openstack/nova-cell1-conductor-0" Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.690224 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b801f420-78a0-4564-9339-fca1170a01d7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b801f420-78a0-4564-9339-fca1170a01d7\") " pod="openstack/nova-cell1-conductor-0" Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.690321 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b801f420-78a0-4564-9339-fca1170a01d7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b801f420-78a0-4564-9339-fca1170a01d7\") " pod="openstack/nova-cell1-conductor-0" Dec 05 23:41:12 crc 
kubenswrapper[4734]: I1205 23:41:12.733893 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhvvx\" (UniqueName: \"kubernetes.io/projected/b801f420-78a0-4564-9339-fca1170a01d7-kube-api-access-vhvvx\") pod \"nova-cell1-conductor-0\" (UID: \"b801f420-78a0-4564-9339-fca1170a01d7\") " pod="openstack/nova-cell1-conductor-0" Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.749774 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.751199 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.901732 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 23:41:12 crc kubenswrapper[4734]: I1205 23:41:12.950245 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mbpqk" Dec 05 23:41:13 crc kubenswrapper[4734]: I1205 23:41:13.083901 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pccp\" (UniqueName: \"kubernetes.io/projected/a44e1e9c-243f-4967-ac93-72db0dd02eb0-kube-api-access-7pccp\") pod \"a44e1e9c-243f-4967-ac93-72db0dd02eb0\" (UID: \"a44e1e9c-243f-4967-ac93-72db0dd02eb0\") " Dec 05 23:41:13 crc kubenswrapper[4734]: I1205 23:41:13.084506 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a44e1e9c-243f-4967-ac93-72db0dd02eb0-config-data\") pod \"a44e1e9c-243f-4967-ac93-72db0dd02eb0\" (UID: \"a44e1e9c-243f-4967-ac93-72db0dd02eb0\") " Dec 05 23:41:13 crc kubenswrapper[4734]: I1205 23:41:13.084596 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a44e1e9c-243f-4967-ac93-72db0dd02eb0-combined-ca-bundle\") pod \"a44e1e9c-243f-4967-ac93-72db0dd02eb0\" (UID: \"a44e1e9c-243f-4967-ac93-72db0dd02eb0\") " Dec 05 23:41:13 crc kubenswrapper[4734]: I1205 23:41:13.084742 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a44e1e9c-243f-4967-ac93-72db0dd02eb0-scripts\") pod \"a44e1e9c-243f-4967-ac93-72db0dd02eb0\" (UID: \"a44e1e9c-243f-4967-ac93-72db0dd02eb0\") " Dec 05 23:41:13 crc kubenswrapper[4734]: I1205 23:41:13.092077 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a44e1e9c-243f-4967-ac93-72db0dd02eb0-scripts" (OuterVolumeSpecName: "scripts") pod "a44e1e9c-243f-4967-ac93-72db0dd02eb0" (UID: "a44e1e9c-243f-4967-ac93-72db0dd02eb0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:41:13 crc kubenswrapper[4734]: I1205 23:41:13.096344 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a44e1e9c-243f-4967-ac93-72db0dd02eb0-kube-api-access-7pccp" (OuterVolumeSpecName: "kube-api-access-7pccp") pod "a44e1e9c-243f-4967-ac93-72db0dd02eb0" (UID: "a44e1e9c-243f-4967-ac93-72db0dd02eb0"). InnerVolumeSpecName "kube-api-access-7pccp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:41:13 crc kubenswrapper[4734]: I1205 23:41:13.121698 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a44e1e9c-243f-4967-ac93-72db0dd02eb0-config-data" (OuterVolumeSpecName: "config-data") pod "a44e1e9c-243f-4967-ac93-72db0dd02eb0" (UID: "a44e1e9c-243f-4967-ac93-72db0dd02eb0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:41:13 crc kubenswrapper[4734]: I1205 23:41:13.133286 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a44e1e9c-243f-4967-ac93-72db0dd02eb0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a44e1e9c-243f-4967-ac93-72db0dd02eb0" (UID: "a44e1e9c-243f-4967-ac93-72db0dd02eb0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:41:13 crc kubenswrapper[4734]: I1205 23:41:13.187146 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a44e1e9c-243f-4967-ac93-72db0dd02eb0-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:13 crc kubenswrapper[4734]: I1205 23:41:13.187189 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a44e1e9c-243f-4967-ac93-72db0dd02eb0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:13 crc kubenswrapper[4734]: I1205 23:41:13.187204 4734 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a44e1e9c-243f-4967-ac93-72db0dd02eb0-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:13 crc kubenswrapper[4734]: I1205 23:41:13.187214 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pccp\" (UniqueName: \"kubernetes.io/projected/a44e1e9c-243f-4967-ac93-72db0dd02eb0-kube-api-access-7pccp\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:13 crc kubenswrapper[4734]: W1205 23:41:13.415367 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb801f420_78a0_4564_9339_fca1170a01d7.slice/crio-359fbb1cd2ec8eb145bb61387ab41db691e51d779d46264b930151affb30c8c1 WatchSource:0}: Error finding container 359fbb1cd2ec8eb145bb61387ab41db691e51d779d46264b930151affb30c8c1: Status 404 
returned error can't find the container with id 359fbb1cd2ec8eb145bb61387ab41db691e51d779d46264b930151affb30c8c1 Dec 05 23:41:13 crc kubenswrapper[4734]: I1205 23:41:13.419698 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 23:41:13 crc kubenswrapper[4734]: I1205 23:41:13.478027 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b801f420-78a0-4564-9339-fca1170a01d7","Type":"ContainerStarted","Data":"359fbb1cd2ec8eb145bb61387ab41db691e51d779d46264b930151affb30c8c1"} Dec 05 23:41:13 crc kubenswrapper[4734]: I1205 23:41:13.484644 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mbpqk" Dec 05 23:41:13 crc kubenswrapper[4734]: I1205 23:41:13.484710 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mbpqk" event={"ID":"a44e1e9c-243f-4967-ac93-72db0dd02eb0","Type":"ContainerDied","Data":"4e516d312e73fa7f57e85b654c51a85682305d59e92cded350a0ba7779279435"} Dec 05 23:41:13 crc kubenswrapper[4734]: I1205 23:41:13.484746 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e516d312e73fa7f57e85b654c51a85682305d59e92cded350a0ba7779279435" Dec 05 23:41:13 crc kubenswrapper[4734]: I1205 23:41:13.590101 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 23:41:13 crc kubenswrapper[4734]: I1205 23:41:13.590412 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4f5a50fc-e01d-43e6-9c99-9e1693246981" containerName="nova-api-log" containerID="cri-o://411d063df803029c8239279e56e7d050cc4e5c75b74d6e0b5457fb3e42051d9b" gracePeriod=30 Dec 05 23:41:13 crc kubenswrapper[4734]: I1205 23:41:13.590562 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="4f5a50fc-e01d-43e6-9c99-9e1693246981" containerName="nova-api-api" containerID="cri-o://677e986abeb971a76a1013f9820694efc92542a64a8325609213baa440d4efc4" gracePeriod=30 Dec 05 23:41:13 crc kubenswrapper[4734]: I1205 23:41:13.631484 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3850ca0d-4d1c-4b14-8633-f313cbb09401" path="/var/lib/kubelet/pods/3850ca0d-4d1c-4b14-8633-f313cbb09401/volumes" Dec 05 23:41:13 crc kubenswrapper[4734]: I1205 23:41:13.632258 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 23:41:13 crc kubenswrapper[4734]: I1205 23:41:13.632510 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe" containerName="nova-scheduler-scheduler" containerID="cri-o://45c51140dc9b89c59442afd9e329a1d5eff3e47bd65c836a65343ec08bc269b0" gracePeriod=30 Dec 05 23:41:13 crc kubenswrapper[4734]: I1205 23:41:13.683330 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 23:41:14 crc kubenswrapper[4734]: I1205 23:41:14.498587 4734 generic.go:334] "Generic (PLEG): container finished" podID="4f5a50fc-e01d-43e6-9c99-9e1693246981" containerID="411d063df803029c8239279e56e7d050cc4e5c75b74d6e0b5457fb3e42051d9b" exitCode=143 Dec 05 23:41:14 crc kubenswrapper[4734]: I1205 23:41:14.498696 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f5a50fc-e01d-43e6-9c99-9e1693246981","Type":"ContainerDied","Data":"411d063df803029c8239279e56e7d050cc4e5c75b74d6e0b5457fb3e42051d9b"} Dec 05 23:41:14 crc kubenswrapper[4734]: I1205 23:41:14.502721 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf" containerName="nova-metadata-log" containerID="cri-o://300d4044b4b257f6ff4bfcafd20a5ce952c2e228635996d4885abf1864231afc" gracePeriod=30 Dec 
05 23:41:14 crc kubenswrapper[4734]: I1205 23:41:14.503240 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b801f420-78a0-4564-9339-fca1170a01d7","Type":"ContainerStarted","Data":"06398ae86266943f27fefe3cf6bb33efa5b3218b7f7a767917d9bcdb53307715"} Dec 05 23:41:14 crc kubenswrapper[4734]: I1205 23:41:14.503431 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 05 23:41:14 crc kubenswrapper[4734]: I1205 23:41:14.503551 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf" containerName="nova-metadata-metadata" containerID="cri-o://bda03de33a6d7e1a1cc14f71a3904c85f82b7bfd082e36b16af6c8c4b24b4c9a" gracePeriod=30 Dec 05 23:41:14 crc kubenswrapper[4734]: I1205 23:41:14.532043 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.532014197 podStartE2EDuration="2.532014197s" podCreationTimestamp="2025-12-05 23:41:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:41:14.527719523 +0000 UTC m=+1295.211123819" watchObservedRunningTime="2025-12-05 23:41:14.532014197 +0000 UTC m=+1295.215418473" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.102118 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 23:41:15 crc kubenswrapper[4734]: E1205 23:41:15.213930 4734 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="45c51140dc9b89c59442afd9e329a1d5eff3e47bd65c836a65343ec08bc269b0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 23:41:15 crc kubenswrapper[4734]: E1205 23:41:15.215158 4734 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="45c51140dc9b89c59442afd9e329a1d5eff3e47bd65c836a65343ec08bc269b0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 23:41:15 crc kubenswrapper[4734]: E1205 23:41:15.216300 4734 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="45c51140dc9b89c59442afd9e329a1d5eff3e47bd65c836a65343ec08bc269b0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 23:41:15 crc kubenswrapper[4734]: E1205 23:41:15.216338 4734 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe" containerName="nova-scheduler-scheduler" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.236769 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf-logs\") pod \"4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf\" (UID: \"4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf\") " Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 
23:41:15.236859 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nstl6\" (UniqueName: \"kubernetes.io/projected/4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf-kube-api-access-nstl6\") pod \"4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf\" (UID: \"4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf\") " Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.237144 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf-combined-ca-bundle\") pod \"4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf\" (UID: \"4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf\") " Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.237189 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf-nova-metadata-tls-certs\") pod \"4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf\" (UID: \"4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf\") " Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.237259 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf-config-data\") pod \"4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf\" (UID: \"4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf\") " Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.237385 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf-logs" (OuterVolumeSpecName: "logs") pod "4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf" (UID: "4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.237825 4734 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf-logs\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.244888 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf-kube-api-access-nstl6" (OuterVolumeSpecName: "kube-api-access-nstl6") pod "4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf" (UID: "4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf"). InnerVolumeSpecName "kube-api-access-nstl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.269199 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf-config-data" (OuterVolumeSpecName: "config-data") pod "4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf" (UID: "4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.281141 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf" (UID: "4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.294414 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf" (UID: "4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.339899 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.339943 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nstl6\" (UniqueName: \"kubernetes.io/projected/4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf-kube-api-access-nstl6\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.339955 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.339967 4734 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.520784 4734 generic.go:334] "Generic (PLEG): container finished" podID="4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf" containerID="bda03de33a6d7e1a1cc14f71a3904c85f82b7bfd082e36b16af6c8c4b24b4c9a" exitCode=0 Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.520830 4734 generic.go:334] "Generic (PLEG): container finished" 
podID="4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf" containerID="300d4044b4b257f6ff4bfcafd20a5ce952c2e228635996d4885abf1864231afc" exitCode=143 Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.521810 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.522280 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf","Type":"ContainerDied","Data":"bda03de33a6d7e1a1cc14f71a3904c85f82b7bfd082e36b16af6c8c4b24b4c9a"} Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.522323 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf","Type":"ContainerDied","Data":"300d4044b4b257f6ff4bfcafd20a5ce952c2e228635996d4885abf1864231afc"} Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.522340 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf","Type":"ContainerDied","Data":"433c43e9ec47b627bec1ee1cbf851804432d2939336aa7203b1294afb024411d"} Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.522358 4734 scope.go:117] "RemoveContainer" containerID="bda03de33a6d7e1a1cc14f71a3904c85f82b7bfd082e36b16af6c8c4b24b4c9a" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.559220 4734 scope.go:117] "RemoveContainer" containerID="300d4044b4b257f6ff4bfcafd20a5ce952c2e228635996d4885abf1864231afc" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.590191 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.592676 4734 scope.go:117] "RemoveContainer" containerID="bda03de33a6d7e1a1cc14f71a3904c85f82b7bfd082e36b16af6c8c4b24b4c9a" Dec 05 23:41:15 crc kubenswrapper[4734]: E1205 23:41:15.593366 4734 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bda03de33a6d7e1a1cc14f71a3904c85f82b7bfd082e36b16af6c8c4b24b4c9a\": container with ID starting with bda03de33a6d7e1a1cc14f71a3904c85f82b7bfd082e36b16af6c8c4b24b4c9a not found: ID does not exist" containerID="bda03de33a6d7e1a1cc14f71a3904c85f82b7bfd082e36b16af6c8c4b24b4c9a" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.593459 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bda03de33a6d7e1a1cc14f71a3904c85f82b7bfd082e36b16af6c8c4b24b4c9a"} err="failed to get container status \"bda03de33a6d7e1a1cc14f71a3904c85f82b7bfd082e36b16af6c8c4b24b4c9a\": rpc error: code = NotFound desc = could not find container \"bda03de33a6d7e1a1cc14f71a3904c85f82b7bfd082e36b16af6c8c4b24b4c9a\": container with ID starting with bda03de33a6d7e1a1cc14f71a3904c85f82b7bfd082e36b16af6c8c4b24b4c9a not found: ID does not exist" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.593497 4734 scope.go:117] "RemoveContainer" containerID="300d4044b4b257f6ff4bfcafd20a5ce952c2e228635996d4885abf1864231afc" Dec 05 23:41:15 crc kubenswrapper[4734]: E1205 23:41:15.593879 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"300d4044b4b257f6ff4bfcafd20a5ce952c2e228635996d4885abf1864231afc\": container with ID starting with 300d4044b4b257f6ff4bfcafd20a5ce952c2e228635996d4885abf1864231afc not found: ID does not exist" containerID="300d4044b4b257f6ff4bfcafd20a5ce952c2e228635996d4885abf1864231afc" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.593926 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"300d4044b4b257f6ff4bfcafd20a5ce952c2e228635996d4885abf1864231afc"} err="failed to get container status \"300d4044b4b257f6ff4bfcafd20a5ce952c2e228635996d4885abf1864231afc\": rpc error: code = NotFound desc = could not find container 
\"300d4044b4b257f6ff4bfcafd20a5ce952c2e228635996d4885abf1864231afc\": container with ID starting with 300d4044b4b257f6ff4bfcafd20a5ce952c2e228635996d4885abf1864231afc not found: ID does not exist" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.593961 4734 scope.go:117] "RemoveContainer" containerID="bda03de33a6d7e1a1cc14f71a3904c85f82b7bfd082e36b16af6c8c4b24b4c9a" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.594224 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bda03de33a6d7e1a1cc14f71a3904c85f82b7bfd082e36b16af6c8c4b24b4c9a"} err="failed to get container status \"bda03de33a6d7e1a1cc14f71a3904c85f82b7bfd082e36b16af6c8c4b24b4c9a\": rpc error: code = NotFound desc = could not find container \"bda03de33a6d7e1a1cc14f71a3904c85f82b7bfd082e36b16af6c8c4b24b4c9a\": container with ID starting with bda03de33a6d7e1a1cc14f71a3904c85f82b7bfd082e36b16af6c8c4b24b4c9a not found: ID does not exist" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.594245 4734 scope.go:117] "RemoveContainer" containerID="300d4044b4b257f6ff4bfcafd20a5ce952c2e228635996d4885abf1864231afc" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.594470 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"300d4044b4b257f6ff4bfcafd20a5ce952c2e228635996d4885abf1864231afc"} err="failed to get container status \"300d4044b4b257f6ff4bfcafd20a5ce952c2e228635996d4885abf1864231afc\": rpc error: code = NotFound desc = could not find container \"300d4044b4b257f6ff4bfcafd20a5ce952c2e228635996d4885abf1864231afc\": container with ID starting with 300d4044b4b257f6ff4bfcafd20a5ce952c2e228635996d4885abf1864231afc not found: ID does not exist" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.605349 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.627345 4734 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf" path="/var/lib/kubelet/pods/4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf/volumes" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.632506 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 23:41:15 crc kubenswrapper[4734]: E1205 23:41:15.633135 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf" containerName="nova-metadata-log" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.633159 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf" containerName="nova-metadata-log" Dec 05 23:41:15 crc kubenswrapper[4734]: E1205 23:41:15.633176 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a44e1e9c-243f-4967-ac93-72db0dd02eb0" containerName="nova-manage" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.633184 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="a44e1e9c-243f-4967-ac93-72db0dd02eb0" containerName="nova-manage" Dec 05 23:41:15 crc kubenswrapper[4734]: E1205 23:41:15.633219 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf" containerName="nova-metadata-metadata" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.633229 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf" containerName="nova-metadata-metadata" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.633508 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf" containerName="nova-metadata-metadata" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.633551 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="a44e1e9c-243f-4967-ac93-72db0dd02eb0" containerName="nova-manage" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.633566 4734 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="4efb809c-9dfb-4aa5-a4ee-d4d226d0b7bf" containerName="nova-metadata-log" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.634955 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.637839 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.638111 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.663631 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.749561 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvv4b\" (UniqueName: \"kubernetes.io/projected/1f9ea4e7-3820-41f2-9232-a79f9f1091ac-kube-api-access-hvv4b\") pod \"nova-metadata-0\" (UID: \"1f9ea4e7-3820-41f2-9232-a79f9f1091ac\") " pod="openstack/nova-metadata-0" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.750132 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f9ea4e7-3820-41f2-9232-a79f9f1091ac-logs\") pod \"nova-metadata-0\" (UID: \"1f9ea4e7-3820-41f2-9232-a79f9f1091ac\") " pod="openstack/nova-metadata-0" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.750269 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f9ea4e7-3820-41f2-9232-a79f9f1091ac-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1f9ea4e7-3820-41f2-9232-a79f9f1091ac\") " pod="openstack/nova-metadata-0" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.750364 
4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f9ea4e7-3820-41f2-9232-a79f9f1091ac-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1f9ea4e7-3820-41f2-9232-a79f9f1091ac\") " pod="openstack/nova-metadata-0" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.750473 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f9ea4e7-3820-41f2-9232-a79f9f1091ac-config-data\") pod \"nova-metadata-0\" (UID: \"1f9ea4e7-3820-41f2-9232-a79f9f1091ac\") " pod="openstack/nova-metadata-0" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.852968 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f9ea4e7-3820-41f2-9232-a79f9f1091ac-config-data\") pod \"nova-metadata-0\" (UID: \"1f9ea4e7-3820-41f2-9232-a79f9f1091ac\") " pod="openstack/nova-metadata-0" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.853953 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvv4b\" (UniqueName: \"kubernetes.io/projected/1f9ea4e7-3820-41f2-9232-a79f9f1091ac-kube-api-access-hvv4b\") pod \"nova-metadata-0\" (UID: \"1f9ea4e7-3820-41f2-9232-a79f9f1091ac\") " pod="openstack/nova-metadata-0" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.854121 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f9ea4e7-3820-41f2-9232-a79f9f1091ac-logs\") pod \"nova-metadata-0\" (UID: \"1f9ea4e7-3820-41f2-9232-a79f9f1091ac\") " pod="openstack/nova-metadata-0" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.854226 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1f9ea4e7-3820-41f2-9232-a79f9f1091ac-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1f9ea4e7-3820-41f2-9232-a79f9f1091ac\") " pod="openstack/nova-metadata-0" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.854326 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f9ea4e7-3820-41f2-9232-a79f9f1091ac-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1f9ea4e7-3820-41f2-9232-a79f9f1091ac\") " pod="openstack/nova-metadata-0" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.855087 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f9ea4e7-3820-41f2-9232-a79f9f1091ac-logs\") pod \"nova-metadata-0\" (UID: \"1f9ea4e7-3820-41f2-9232-a79f9f1091ac\") " pod="openstack/nova-metadata-0" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.859269 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f9ea4e7-3820-41f2-9232-a79f9f1091ac-config-data\") pod \"nova-metadata-0\" (UID: \"1f9ea4e7-3820-41f2-9232-a79f9f1091ac\") " pod="openstack/nova-metadata-0" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.859382 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f9ea4e7-3820-41f2-9232-a79f9f1091ac-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1f9ea4e7-3820-41f2-9232-a79f9f1091ac\") " pod="openstack/nova-metadata-0" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.860822 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f9ea4e7-3820-41f2-9232-a79f9f1091ac-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1f9ea4e7-3820-41f2-9232-a79f9f1091ac\") " pod="openstack/nova-metadata-0" Dec 05 23:41:15 crc 
kubenswrapper[4734]: I1205 23:41:15.887212 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvv4b\" (UniqueName: \"kubernetes.io/projected/1f9ea4e7-3820-41f2-9232-a79f9f1091ac-kube-api-access-hvv4b\") pod \"nova-metadata-0\" (UID: \"1f9ea4e7-3820-41f2-9232-a79f9f1091ac\") " pod="openstack/nova-metadata-0" Dec 05 23:41:15 crc kubenswrapper[4734]: I1205 23:41:15.996776 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 23:41:16 crc kubenswrapper[4734]: I1205 23:41:16.501907 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 23:41:16 crc kubenswrapper[4734]: W1205 23:41:16.508257 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f9ea4e7_3820_41f2_9232_a79f9f1091ac.slice/crio-6793fcd26d871a7d050c131cdc9974367cf22506787cc690c02d52e31cbae85b WatchSource:0}: Error finding container 6793fcd26d871a7d050c131cdc9974367cf22506787cc690c02d52e31cbae85b: Status 404 returned error can't find the container with id 6793fcd26d871a7d050c131cdc9974367cf22506787cc690c02d52e31cbae85b Dec 05 23:41:16 crc kubenswrapper[4734]: I1205 23:41:16.536852 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1f9ea4e7-3820-41f2-9232-a79f9f1091ac","Type":"ContainerStarted","Data":"6793fcd26d871a7d050c131cdc9974367cf22506787cc690c02d52e31cbae85b"} Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.139733 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.295114 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f5a50fc-e01d-43e6-9c99-9e1693246981-config-data\") pod \"4f5a50fc-e01d-43e6-9c99-9e1693246981\" (UID: \"4f5a50fc-e01d-43e6-9c99-9e1693246981\") " Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.295180 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f5a50fc-e01d-43e6-9c99-9e1693246981-logs\") pod \"4f5a50fc-e01d-43e6-9c99-9e1693246981\" (UID: \"4f5a50fc-e01d-43e6-9c99-9e1693246981\") " Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.295207 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvglm\" (UniqueName: \"kubernetes.io/projected/4f5a50fc-e01d-43e6-9c99-9e1693246981-kube-api-access-vvglm\") pod \"4f5a50fc-e01d-43e6-9c99-9e1693246981\" (UID: \"4f5a50fc-e01d-43e6-9c99-9e1693246981\") " Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.295252 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f5a50fc-e01d-43e6-9c99-9e1693246981-combined-ca-bundle\") pod \"4f5a50fc-e01d-43e6-9c99-9e1693246981\" (UID: \"4f5a50fc-e01d-43e6-9c99-9e1693246981\") " Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.296111 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f5a50fc-e01d-43e6-9c99-9e1693246981-logs" (OuterVolumeSpecName: "logs") pod "4f5a50fc-e01d-43e6-9c99-9e1693246981" (UID: "4f5a50fc-e01d-43e6-9c99-9e1693246981"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.302666 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f5a50fc-e01d-43e6-9c99-9e1693246981-kube-api-access-vvglm" (OuterVolumeSpecName: "kube-api-access-vvglm") pod "4f5a50fc-e01d-43e6-9c99-9e1693246981" (UID: "4f5a50fc-e01d-43e6-9c99-9e1693246981"). InnerVolumeSpecName "kube-api-access-vvglm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.327113 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f5a50fc-e01d-43e6-9c99-9e1693246981-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f5a50fc-e01d-43e6-9c99-9e1693246981" (UID: "4f5a50fc-e01d-43e6-9c99-9e1693246981"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.329942 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f5a50fc-e01d-43e6-9c99-9e1693246981-config-data" (OuterVolumeSpecName: "config-data") pod "4f5a50fc-e01d-43e6-9c99-9e1693246981" (UID: "4f5a50fc-e01d-43e6-9c99-9e1693246981"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.397518 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f5a50fc-e01d-43e6-9c99-9e1693246981-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.397592 4734 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f5a50fc-e01d-43e6-9c99-9e1693246981-logs\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.397606 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvglm\" (UniqueName: \"kubernetes.io/projected/4f5a50fc-e01d-43e6-9c99-9e1693246981-kube-api-access-vvglm\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.397619 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f5a50fc-e01d-43e6-9c99-9e1693246981-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.560682 4734 generic.go:334] "Generic (PLEG): container finished" podID="4f5a50fc-e01d-43e6-9c99-9e1693246981" containerID="677e986abeb971a76a1013f9820694efc92542a64a8325609213baa440d4efc4" exitCode=0 Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.560771 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f5a50fc-e01d-43e6-9c99-9e1693246981","Type":"ContainerDied","Data":"677e986abeb971a76a1013f9820694efc92542a64a8325609213baa440d4efc4"} Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.560823 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f5a50fc-e01d-43e6-9c99-9e1693246981","Type":"ContainerDied","Data":"f91c715a157541b9f76c4249f10b528d1d65a00bea5b65e56d8407413e40cef8"} Dec 05 23:41:17 crc kubenswrapper[4734]: 
I1205 23:41:17.560846 4734 scope.go:117] "RemoveContainer" containerID="677e986abeb971a76a1013f9820694efc92542a64a8325609213baa440d4efc4" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.560997 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.569355 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1f9ea4e7-3820-41f2-9232-a79f9f1091ac","Type":"ContainerStarted","Data":"018ec382f9db53d69ad6e01565326aa41397b51fb465e9293103e638af38fc58"} Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.569888 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1f9ea4e7-3820-41f2-9232-a79f9f1091ac","Type":"ContainerStarted","Data":"162bda318be27c2b91bb6ad6b2d64f0f00ebc952462a8c3be681352cdb7c71e1"} Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.610377 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.610344517 podStartE2EDuration="2.610344517s" podCreationTimestamp="2025-12-05 23:41:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:41:17.60758018 +0000 UTC m=+1298.290984476" watchObservedRunningTime="2025-12-05 23:41:17.610344517 +0000 UTC m=+1298.293748793" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.612574 4734 scope.go:117] "RemoveContainer" containerID="411d063df803029c8239279e56e7d050cc4e5c75b74d6e0b5457fb3e42051d9b" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.662977 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.674293 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 
23:41:17.689276 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 23:41:17 crc kubenswrapper[4734]: E1205 23:41:17.689935 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f5a50fc-e01d-43e6-9c99-9e1693246981" containerName="nova-api-log" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.689962 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f5a50fc-e01d-43e6-9c99-9e1693246981" containerName="nova-api-log" Dec 05 23:41:17 crc kubenswrapper[4734]: E1205 23:41:17.690178 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f5a50fc-e01d-43e6-9c99-9e1693246981" containerName="nova-api-api" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.690192 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f5a50fc-e01d-43e6-9c99-9e1693246981" containerName="nova-api-api" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.690595 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f5a50fc-e01d-43e6-9c99-9e1693246981" containerName="nova-api-log" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.690745 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f5a50fc-e01d-43e6-9c99-9e1693246981" containerName="nova-api-api" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.692480 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.698260 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.701807 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.813515 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7lgm\" (UniqueName: \"kubernetes.io/projected/b2236711-73b3-4063-8ca9-a349b15f26b9-kube-api-access-w7lgm\") pod \"nova-api-0\" (UID: \"b2236711-73b3-4063-8ca9-a349b15f26b9\") " pod="openstack/nova-api-0" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.813671 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2236711-73b3-4063-8ca9-a349b15f26b9-config-data\") pod \"nova-api-0\" (UID: \"b2236711-73b3-4063-8ca9-a349b15f26b9\") " pod="openstack/nova-api-0" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.814015 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2236711-73b3-4063-8ca9-a349b15f26b9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b2236711-73b3-4063-8ca9-a349b15f26b9\") " pod="openstack/nova-api-0" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.814068 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2236711-73b3-4063-8ca9-a349b15f26b9-logs\") pod \"nova-api-0\" (UID: \"b2236711-73b3-4063-8ca9-a349b15f26b9\") " pod="openstack/nova-api-0" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.834856 4734 scope.go:117] "RemoveContainer" 
containerID="677e986abeb971a76a1013f9820694efc92542a64a8325609213baa440d4efc4" Dec 05 23:41:17 crc kubenswrapper[4734]: E1205 23:41:17.835565 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"677e986abeb971a76a1013f9820694efc92542a64a8325609213baa440d4efc4\": container with ID starting with 677e986abeb971a76a1013f9820694efc92542a64a8325609213baa440d4efc4 not found: ID does not exist" containerID="677e986abeb971a76a1013f9820694efc92542a64a8325609213baa440d4efc4" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.835608 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"677e986abeb971a76a1013f9820694efc92542a64a8325609213baa440d4efc4"} err="failed to get container status \"677e986abeb971a76a1013f9820694efc92542a64a8325609213baa440d4efc4\": rpc error: code = NotFound desc = could not find container \"677e986abeb971a76a1013f9820694efc92542a64a8325609213baa440d4efc4\": container with ID starting with 677e986abeb971a76a1013f9820694efc92542a64a8325609213baa440d4efc4 not found: ID does not exist" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.835643 4734 scope.go:117] "RemoveContainer" containerID="411d063df803029c8239279e56e7d050cc4e5c75b74d6e0b5457fb3e42051d9b" Dec 05 23:41:17 crc kubenswrapper[4734]: E1205 23:41:17.836027 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"411d063df803029c8239279e56e7d050cc4e5c75b74d6e0b5457fb3e42051d9b\": container with ID starting with 411d063df803029c8239279e56e7d050cc4e5c75b74d6e0b5457fb3e42051d9b not found: ID does not exist" containerID="411d063df803029c8239279e56e7d050cc4e5c75b74d6e0b5457fb3e42051d9b" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.836061 4734 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"411d063df803029c8239279e56e7d050cc4e5c75b74d6e0b5457fb3e42051d9b"} err="failed to get container status \"411d063df803029c8239279e56e7d050cc4e5c75b74d6e0b5457fb3e42051d9b\": rpc error: code = NotFound desc = could not find container \"411d063df803029c8239279e56e7d050cc4e5c75b74d6e0b5457fb3e42051d9b\": container with ID starting with 411d063df803029c8239279e56e7d050cc4e5c75b74d6e0b5457fb3e42051d9b not found: ID does not exist" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.916128 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2236711-73b3-4063-8ca9-a349b15f26b9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b2236711-73b3-4063-8ca9-a349b15f26b9\") " pod="openstack/nova-api-0" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.916217 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2236711-73b3-4063-8ca9-a349b15f26b9-logs\") pod \"nova-api-0\" (UID: \"b2236711-73b3-4063-8ca9-a349b15f26b9\") " pod="openstack/nova-api-0" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.916271 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7lgm\" (UniqueName: \"kubernetes.io/projected/b2236711-73b3-4063-8ca9-a349b15f26b9-kube-api-access-w7lgm\") pod \"nova-api-0\" (UID: \"b2236711-73b3-4063-8ca9-a349b15f26b9\") " pod="openstack/nova-api-0" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.916302 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2236711-73b3-4063-8ca9-a349b15f26b9-config-data\") pod \"nova-api-0\" (UID: \"b2236711-73b3-4063-8ca9-a349b15f26b9\") " pod="openstack/nova-api-0" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.917717 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2236711-73b3-4063-8ca9-a349b15f26b9-logs\") pod \"nova-api-0\" (UID: \"b2236711-73b3-4063-8ca9-a349b15f26b9\") " pod="openstack/nova-api-0" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.923427 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2236711-73b3-4063-8ca9-a349b15f26b9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b2236711-73b3-4063-8ca9-a349b15f26b9\") " pod="openstack/nova-api-0" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.923810 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2236711-73b3-4063-8ca9-a349b15f26b9-config-data\") pod \"nova-api-0\" (UID: \"b2236711-73b3-4063-8ca9-a349b15f26b9\") " pod="openstack/nova-api-0" Dec 05 23:41:17 crc kubenswrapper[4734]: I1205 23:41:17.944424 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7lgm\" (UniqueName: \"kubernetes.io/projected/b2236711-73b3-4063-8ca9-a349b15f26b9-kube-api-access-w7lgm\") pod \"nova-api-0\" (UID: \"b2236711-73b3-4063-8ca9-a349b15f26b9\") " pod="openstack/nova-api-0" Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.120088 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.136324 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.222509 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe-combined-ca-bundle\") pod \"5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe\" (UID: \"5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe\") " Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.222826 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe-config-data\") pod \"5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe\" (UID: \"5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe\") " Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.222986 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rl9qc\" (UniqueName: \"kubernetes.io/projected/5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe-kube-api-access-rl9qc\") pod \"5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe\" (UID: \"5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe\") " Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.229658 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe-kube-api-access-rl9qc" (OuterVolumeSpecName: "kube-api-access-rl9qc") pod "5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe" (UID: "5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe"). InnerVolumeSpecName "kube-api-access-rl9qc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.265832 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe-config-data" (OuterVolumeSpecName: "config-data") pod "5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe" (UID: "5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.267879 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe" (UID: "5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.325877 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.325911 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.325922 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rl9qc\" (UniqueName: \"kubernetes.io/projected/5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe-kube-api-access-rl9qc\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.583649 4734 generic.go:334] "Generic (PLEG): container finished" podID="5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe" containerID="45c51140dc9b89c59442afd9e329a1d5eff3e47bd65c836a65343ec08bc269b0" exitCode=0 Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.583748 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.583718 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe","Type":"ContainerDied","Data":"45c51140dc9b89c59442afd9e329a1d5eff3e47bd65c836a65343ec08bc269b0"} Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.584341 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe","Type":"ContainerDied","Data":"92589d2be56ae07d94afc8fe0a52d95a89da5f7e48a0a09df95255162e38fc29"} Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.584374 4734 scope.go:117] "RemoveContainer" containerID="45c51140dc9b89c59442afd9e329a1d5eff3e47bd65c836a65343ec08bc269b0" Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.609319 4734 scope.go:117] "RemoveContainer" containerID="45c51140dc9b89c59442afd9e329a1d5eff3e47bd65c836a65343ec08bc269b0" Dec 05 23:41:18 crc kubenswrapper[4734]: E1205 23:41:18.611361 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45c51140dc9b89c59442afd9e329a1d5eff3e47bd65c836a65343ec08bc269b0\": container with ID starting with 45c51140dc9b89c59442afd9e329a1d5eff3e47bd65c836a65343ec08bc269b0 not found: ID does not exist" containerID="45c51140dc9b89c59442afd9e329a1d5eff3e47bd65c836a65343ec08bc269b0" Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.611437 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45c51140dc9b89c59442afd9e329a1d5eff3e47bd65c836a65343ec08bc269b0"} err="failed to get container status \"45c51140dc9b89c59442afd9e329a1d5eff3e47bd65c836a65343ec08bc269b0\": rpc error: code = NotFound desc = could not find container \"45c51140dc9b89c59442afd9e329a1d5eff3e47bd65c836a65343ec08bc269b0\": container with ID starting with 
45c51140dc9b89c59442afd9e329a1d5eff3e47bd65c836a65343ec08bc269b0 not found: ID does not exist" Dec 05 23:41:18 crc kubenswrapper[4734]: W1205 23:41:18.624357 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2236711_73b3_4063_8ca9_a349b15f26b9.slice/crio-73e26de52199e56ee293ea9cc1f8b8542294c11710a7a5be8d55e4756227e976 WatchSource:0}: Error finding container 73e26de52199e56ee293ea9cc1f8b8542294c11710a7a5be8d55e4756227e976: Status 404 returned error can't find the container with id 73e26de52199e56ee293ea9cc1f8b8542294c11710a7a5be8d55e4756227e976 Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.626581 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.637421 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.656494 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.670074 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 23:41:18 crc kubenswrapper[4734]: E1205 23:41:18.670931 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe" containerName="nova-scheduler-scheduler" Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.670964 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe" containerName="nova-scheduler-scheduler" Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.671242 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe" containerName="nova-scheduler-scheduler" Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.672344 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.678274 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.703830 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.734975 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plzcq\" (UniqueName: \"kubernetes.io/projected/adf827fa-f9ac-4c4b-bf90-65a59438d9b6-kube-api-access-plzcq\") pod \"nova-scheduler-0\" (UID: \"adf827fa-f9ac-4c4b-bf90-65a59438d9b6\") " pod="openstack/nova-scheduler-0" Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.735102 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adf827fa-f9ac-4c4b-bf90-65a59438d9b6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"adf827fa-f9ac-4c4b-bf90-65a59438d9b6\") " pod="openstack/nova-scheduler-0" Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.736053 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adf827fa-f9ac-4c4b-bf90-65a59438d9b6-config-data\") pod \"nova-scheduler-0\" (UID: \"adf827fa-f9ac-4c4b-bf90-65a59438d9b6\") " pod="openstack/nova-scheduler-0" Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.838518 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plzcq\" (UniqueName: \"kubernetes.io/projected/adf827fa-f9ac-4c4b-bf90-65a59438d9b6-kube-api-access-plzcq\") pod \"nova-scheduler-0\" (UID: \"adf827fa-f9ac-4c4b-bf90-65a59438d9b6\") " pod="openstack/nova-scheduler-0" Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.838661 4734 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adf827fa-f9ac-4c4b-bf90-65a59438d9b6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"adf827fa-f9ac-4c4b-bf90-65a59438d9b6\") " pod="openstack/nova-scheduler-0" Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.838714 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adf827fa-f9ac-4c4b-bf90-65a59438d9b6-config-data\") pod \"nova-scheduler-0\" (UID: \"adf827fa-f9ac-4c4b-bf90-65a59438d9b6\") " pod="openstack/nova-scheduler-0" Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.849542 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adf827fa-f9ac-4c4b-bf90-65a59438d9b6-config-data\") pod \"nova-scheduler-0\" (UID: \"adf827fa-f9ac-4c4b-bf90-65a59438d9b6\") " pod="openstack/nova-scheduler-0" Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.849575 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adf827fa-f9ac-4c4b-bf90-65a59438d9b6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"adf827fa-f9ac-4c4b-bf90-65a59438d9b6\") " pod="openstack/nova-scheduler-0" Dec 05 23:41:18 crc kubenswrapper[4734]: I1205 23:41:18.861407 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plzcq\" (UniqueName: \"kubernetes.io/projected/adf827fa-f9ac-4c4b-bf90-65a59438d9b6-kube-api-access-plzcq\") pod \"nova-scheduler-0\" (UID: \"adf827fa-f9ac-4c4b-bf90-65a59438d9b6\") " pod="openstack/nova-scheduler-0" Dec 05 23:41:19 crc kubenswrapper[4734]: I1205 23:41:19.095817 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 23:41:19 crc kubenswrapper[4734]: I1205 23:41:19.555471 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 23:41:19 crc kubenswrapper[4734]: W1205 23:41:19.562398 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadf827fa_f9ac_4c4b_bf90_65a59438d9b6.slice/crio-1dd9f085d37d6ebe5383e22604ab3413f99ab061b563704a3d3c272fb2bfcc18 WatchSource:0}: Error finding container 1dd9f085d37d6ebe5383e22604ab3413f99ab061b563704a3d3c272fb2bfcc18: Status 404 returned error can't find the container with id 1dd9f085d37d6ebe5383e22604ab3413f99ab061b563704a3d3c272fb2bfcc18 Dec 05 23:41:19 crc kubenswrapper[4734]: I1205 23:41:19.632701 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f5a50fc-e01d-43e6-9c99-9e1693246981" path="/var/lib/kubelet/pods/4f5a50fc-e01d-43e6-9c99-9e1693246981/volumes" Dec 05 23:41:19 crc kubenswrapper[4734]: I1205 23:41:19.633342 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe" path="/var/lib/kubelet/pods/5df0dcaa-3a9d-4f8a-a97e-f59067dc0ffe/volumes" Dec 05 23:41:19 crc kubenswrapper[4734]: I1205 23:41:19.634034 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"adf827fa-f9ac-4c4b-bf90-65a59438d9b6","Type":"ContainerStarted","Data":"1dd9f085d37d6ebe5383e22604ab3413f99ab061b563704a3d3c272fb2bfcc18"} Dec 05 23:41:19 crc kubenswrapper[4734]: I1205 23:41:19.651690 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b2236711-73b3-4063-8ca9-a349b15f26b9","Type":"ContainerStarted","Data":"0ea414b7c1caae3e32ba909cdd1c87ac566cc98790987839ea53c27c028792d2"} Dec 05 23:41:19 crc kubenswrapper[4734]: I1205 23:41:19.651745 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"b2236711-73b3-4063-8ca9-a349b15f26b9","Type":"ContainerStarted","Data":"24ccf66acfff43efc13998c653794fbaed143e80efaddd27a4a1738ad57f1e7b"} Dec 05 23:41:19 crc kubenswrapper[4734]: I1205 23:41:19.651757 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b2236711-73b3-4063-8ca9-a349b15f26b9","Type":"ContainerStarted","Data":"73e26de52199e56ee293ea9cc1f8b8542294c11710a7a5be8d55e4756227e976"} Dec 05 23:41:19 crc kubenswrapper[4734]: I1205 23:41:19.693105 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.693072372 podStartE2EDuration="2.693072372s" podCreationTimestamp="2025-12-05 23:41:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:41:19.688515131 +0000 UTC m=+1300.371919427" watchObservedRunningTime="2025-12-05 23:41:19.693072372 +0000 UTC m=+1300.376476668" Dec 05 23:41:20 crc kubenswrapper[4734]: I1205 23:41:20.665505 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"adf827fa-f9ac-4c4b-bf90-65a59438d9b6","Type":"ContainerStarted","Data":"902eb3e8ddd9fa5489a4816e63c558c1da48727d14669e097fef551dabc0b6da"} Dec 05 23:41:20 crc kubenswrapper[4734]: I1205 23:41:20.697436 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.6974097500000003 podStartE2EDuration="2.69740975s" podCreationTimestamp="2025-12-05 23:41:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:41:20.6895808 +0000 UTC m=+1301.372985086" watchObservedRunningTime="2025-12-05 23:41:20.69740975 +0000 UTC m=+1301.380814026" Dec 05 23:41:20 crc kubenswrapper[4734]: I1205 23:41:20.997501 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-metadata-0" Dec 05 23:41:20 crc kubenswrapper[4734]: I1205 23:41:20.997580 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 23:41:22 crc kubenswrapper[4734]: I1205 23:41:22.935337 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 05 23:41:24 crc kubenswrapper[4734]: I1205 23:41:24.096285 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 05 23:41:25 crc kubenswrapper[4734]: I1205 23:41:25.996995 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 23:41:25 crc kubenswrapper[4734]: I1205 23:41:25.997459 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 23:41:27 crc kubenswrapper[4734]: I1205 23:41:27.009897 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1f9ea4e7-3820-41f2-9232-a79f9f1091ac" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 23:41:27 crc kubenswrapper[4734]: I1205 23:41:27.009956 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1f9ea4e7-3820-41f2-9232-a79f9f1091ac" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 23:41:28 crc kubenswrapper[4734]: I1205 23:41:28.137283 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 23:41:28 crc kubenswrapper[4734]: I1205 23:41:28.137964 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 23:41:29 crc 
kubenswrapper[4734]: I1205 23:41:29.096487 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 05 23:41:29 crc kubenswrapper[4734]: I1205 23:41:29.128867 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 05 23:41:29 crc kubenswrapper[4734]: I1205 23:41:29.220005 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b2236711-73b3-4063-8ca9-a349b15f26b9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 23:41:29 crc kubenswrapper[4734]: I1205 23:41:29.220019 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b2236711-73b3-4063-8ca9-a349b15f26b9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 23:41:29 crc kubenswrapper[4734]: I1205 23:41:29.793721 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 05 23:41:36 crc kubenswrapper[4734]: I1205 23:41:36.004194 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 23:41:36 crc kubenswrapper[4734]: I1205 23:41:36.005329 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 23:41:36 crc kubenswrapper[4734]: I1205 23:41:36.011104 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 23:41:36 crc kubenswrapper[4734]: I1205 23:41:36.013980 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 23:41:36 crc kubenswrapper[4734]: I1205 23:41:36.754162 4734 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:41:36 crc kubenswrapper[4734]: I1205 23:41:36.836097 4734 generic.go:334] "Generic (PLEG): container finished" podID="47c53a21-cffa-4f1f-8379-0dc6d805bc99" containerID="25cd2e727ef56bc8b06e59a125ebf10af81582d3e0e479cb7a45294050547b08" exitCode=137 Dec 05 23:41:36 crc kubenswrapper[4734]: I1205 23:41:36.836164 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"47c53a21-cffa-4f1f-8379-0dc6d805bc99","Type":"ContainerDied","Data":"25cd2e727ef56bc8b06e59a125ebf10af81582d3e0e479cb7a45294050547b08"} Dec 05 23:41:36 crc kubenswrapper[4734]: I1205 23:41:36.836230 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"47c53a21-cffa-4f1f-8379-0dc6d805bc99","Type":"ContainerDied","Data":"68383d03cf73c839d17054de97d5bcafe601670ac62f95cc9815b939b30a0bb4"} Dec 05 23:41:36 crc kubenswrapper[4734]: I1205 23:41:36.836258 4734 scope.go:117] "RemoveContainer" containerID="25cd2e727ef56bc8b06e59a125ebf10af81582d3e0e479cb7a45294050547b08" Dec 05 23:41:36 crc kubenswrapper[4734]: I1205 23:41:36.836246 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:41:36 crc kubenswrapper[4734]: I1205 23:41:36.862248 4734 scope.go:117] "RemoveContainer" containerID="25cd2e727ef56bc8b06e59a125ebf10af81582d3e0e479cb7a45294050547b08" Dec 05 23:41:36 crc kubenswrapper[4734]: E1205 23:41:36.863098 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25cd2e727ef56bc8b06e59a125ebf10af81582d3e0e479cb7a45294050547b08\": container with ID starting with 25cd2e727ef56bc8b06e59a125ebf10af81582d3e0e479cb7a45294050547b08 not found: ID does not exist" containerID="25cd2e727ef56bc8b06e59a125ebf10af81582d3e0e479cb7a45294050547b08" Dec 05 23:41:36 crc kubenswrapper[4734]: I1205 23:41:36.863154 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25cd2e727ef56bc8b06e59a125ebf10af81582d3e0e479cb7a45294050547b08"} err="failed to get container status \"25cd2e727ef56bc8b06e59a125ebf10af81582d3e0e479cb7a45294050547b08\": rpc error: code = NotFound desc = could not find container \"25cd2e727ef56bc8b06e59a125ebf10af81582d3e0e479cb7a45294050547b08\": container with ID starting with 25cd2e727ef56bc8b06e59a125ebf10af81582d3e0e479cb7a45294050547b08 not found: ID does not exist" Dec 05 23:41:36 crc kubenswrapper[4734]: I1205 23:41:36.894034 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47c53a21-cffa-4f1f-8379-0dc6d805bc99-config-data\") pod \"47c53a21-cffa-4f1f-8379-0dc6d805bc99\" (UID: \"47c53a21-cffa-4f1f-8379-0dc6d805bc99\") " Dec 05 23:41:36 crc kubenswrapper[4734]: I1205 23:41:36.894134 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47c53a21-cffa-4f1f-8379-0dc6d805bc99-combined-ca-bundle\") pod \"47c53a21-cffa-4f1f-8379-0dc6d805bc99\" (UID: 
\"47c53a21-cffa-4f1f-8379-0dc6d805bc99\") " Dec 05 23:41:36 crc kubenswrapper[4734]: I1205 23:41:36.894246 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-th6lp\" (UniqueName: \"kubernetes.io/projected/47c53a21-cffa-4f1f-8379-0dc6d805bc99-kube-api-access-th6lp\") pod \"47c53a21-cffa-4f1f-8379-0dc6d805bc99\" (UID: \"47c53a21-cffa-4f1f-8379-0dc6d805bc99\") " Dec 05 23:41:36 crc kubenswrapper[4734]: I1205 23:41:36.903488 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47c53a21-cffa-4f1f-8379-0dc6d805bc99-kube-api-access-th6lp" (OuterVolumeSpecName: "kube-api-access-th6lp") pod "47c53a21-cffa-4f1f-8379-0dc6d805bc99" (UID: "47c53a21-cffa-4f1f-8379-0dc6d805bc99"). InnerVolumeSpecName "kube-api-access-th6lp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:41:36 crc kubenswrapper[4734]: I1205 23:41:36.928576 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47c53a21-cffa-4f1f-8379-0dc6d805bc99-config-data" (OuterVolumeSpecName: "config-data") pod "47c53a21-cffa-4f1f-8379-0dc6d805bc99" (UID: "47c53a21-cffa-4f1f-8379-0dc6d805bc99"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:41:36 crc kubenswrapper[4734]: I1205 23:41:36.929273 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47c53a21-cffa-4f1f-8379-0dc6d805bc99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47c53a21-cffa-4f1f-8379-0dc6d805bc99" (UID: "47c53a21-cffa-4f1f-8379-0dc6d805bc99"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:41:36 crc kubenswrapper[4734]: I1205 23:41:36.997611 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47c53a21-cffa-4f1f-8379-0dc6d805bc99-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:36 crc kubenswrapper[4734]: I1205 23:41:36.998350 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47c53a21-cffa-4f1f-8379-0dc6d805bc99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:36 crc kubenswrapper[4734]: I1205 23:41:36.998387 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-th6lp\" (UniqueName: \"kubernetes.io/projected/47c53a21-cffa-4f1f-8379-0dc6d805bc99-kube-api-access-th6lp\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:37 crc kubenswrapper[4734]: I1205 23:41:37.180796 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 23:41:37 crc kubenswrapper[4734]: I1205 23:41:37.199514 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 23:41:37 crc kubenswrapper[4734]: I1205 23:41:37.210075 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 23:41:37 crc kubenswrapper[4734]: E1205 23:41:37.210597 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47c53a21-cffa-4f1f-8379-0dc6d805bc99" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 23:41:37 crc kubenswrapper[4734]: I1205 23:41:37.210617 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="47c53a21-cffa-4f1f-8379-0dc6d805bc99" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 23:41:37 crc kubenswrapper[4734]: I1205 23:41:37.210842 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="47c53a21-cffa-4f1f-8379-0dc6d805bc99" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 
23:41:37 crc kubenswrapper[4734]: I1205 23:41:37.211708 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:41:37 crc kubenswrapper[4734]: I1205 23:41:37.214169 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 05 23:41:37 crc kubenswrapper[4734]: I1205 23:41:37.214653 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 05 23:41:37 crc kubenswrapper[4734]: I1205 23:41:37.215706 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 05 23:41:37 crc kubenswrapper[4734]: I1205 23:41:37.222884 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 23:41:37 crc kubenswrapper[4734]: I1205 23:41:37.407975 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/41bee178-e2d7-4047-9c0a-429dc21411ed-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"41bee178-e2d7-4047-9c0a-429dc21411ed\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:41:37 crc kubenswrapper[4734]: I1205 23:41:37.408164 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41bee178-e2d7-4047-9c0a-429dc21411ed-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"41bee178-e2d7-4047-9c0a-429dc21411ed\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:41:37 crc kubenswrapper[4734]: I1205 23:41:37.408374 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/41bee178-e2d7-4047-9c0a-429dc21411ed-nova-novncproxy-tls-certs\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"41bee178-e2d7-4047-9c0a-429dc21411ed\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:41:37 crc kubenswrapper[4734]: I1205 23:41:37.408444 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5d6w\" (UniqueName: \"kubernetes.io/projected/41bee178-e2d7-4047-9c0a-429dc21411ed-kube-api-access-c5d6w\") pod \"nova-cell1-novncproxy-0\" (UID: \"41bee178-e2d7-4047-9c0a-429dc21411ed\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:41:37 crc kubenswrapper[4734]: I1205 23:41:37.408496 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41bee178-e2d7-4047-9c0a-429dc21411ed-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"41bee178-e2d7-4047-9c0a-429dc21411ed\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:41:37 crc kubenswrapper[4734]: I1205 23:41:37.511786 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41bee178-e2d7-4047-9c0a-429dc21411ed-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"41bee178-e2d7-4047-9c0a-429dc21411ed\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:41:37 crc kubenswrapper[4734]: I1205 23:41:37.511968 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/41bee178-e2d7-4047-9c0a-429dc21411ed-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"41bee178-e2d7-4047-9c0a-429dc21411ed\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:41:37 crc kubenswrapper[4734]: I1205 23:41:37.512011 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5d6w\" (UniqueName: \"kubernetes.io/projected/41bee178-e2d7-4047-9c0a-429dc21411ed-kube-api-access-c5d6w\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"41bee178-e2d7-4047-9c0a-429dc21411ed\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:41:37 crc kubenswrapper[4734]: I1205 23:41:37.512092 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41bee178-e2d7-4047-9c0a-429dc21411ed-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"41bee178-e2d7-4047-9c0a-429dc21411ed\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:41:37 crc kubenswrapper[4734]: I1205 23:41:37.512923 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/41bee178-e2d7-4047-9c0a-429dc21411ed-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"41bee178-e2d7-4047-9c0a-429dc21411ed\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:41:37 crc kubenswrapper[4734]: I1205 23:41:37.517762 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/41bee178-e2d7-4047-9c0a-429dc21411ed-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"41bee178-e2d7-4047-9c0a-429dc21411ed\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:41:37 crc kubenswrapper[4734]: I1205 23:41:37.518201 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41bee178-e2d7-4047-9c0a-429dc21411ed-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"41bee178-e2d7-4047-9c0a-429dc21411ed\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:41:37 crc kubenswrapper[4734]: I1205 23:41:37.518644 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41bee178-e2d7-4047-9c0a-429dc21411ed-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"41bee178-e2d7-4047-9c0a-429dc21411ed\") " pod="openstack/nova-cell1-novncproxy-0" 
Dec 05 23:41:37 crc kubenswrapper[4734]: I1205 23:41:37.519501 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/41bee178-e2d7-4047-9c0a-429dc21411ed-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"41bee178-e2d7-4047-9c0a-429dc21411ed\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:41:37 crc kubenswrapper[4734]: I1205 23:41:37.531494 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5d6w\" (UniqueName: \"kubernetes.io/projected/41bee178-e2d7-4047-9c0a-429dc21411ed-kube-api-access-c5d6w\") pod \"nova-cell1-novncproxy-0\" (UID: \"41bee178-e2d7-4047-9c0a-429dc21411ed\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:41:37 crc kubenswrapper[4734]: I1205 23:41:37.542999 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:41:37 crc kubenswrapper[4734]: I1205 23:41:37.637264 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47c53a21-cffa-4f1f-8379-0dc6d805bc99" path="/var/lib/kubelet/pods/47c53a21-cffa-4f1f-8379-0dc6d805bc99/volumes" Dec 05 23:41:38 crc kubenswrapper[4734]: I1205 23:41:38.012251 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 23:41:38 crc kubenswrapper[4734]: I1205 23:41:38.141963 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 23:41:38 crc kubenswrapper[4734]: I1205 23:41:38.142535 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 23:41:38 crc kubenswrapper[4734]: I1205 23:41:38.143178 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 23:41:38 crc kubenswrapper[4734]: I1205 23:41:38.143622 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 
05 23:41:38 crc kubenswrapper[4734]: I1205 23:41:38.147003 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 23:41:38 crc kubenswrapper[4734]: I1205 23:41:38.147735 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 23:41:38 crc kubenswrapper[4734]: I1205 23:41:38.394118 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-tklww"] Dec 05 23:41:38 crc kubenswrapper[4734]: I1205 23:41:38.396591 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-tklww" Dec 05 23:41:38 crc kubenswrapper[4734]: I1205 23:41:38.405494 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-tklww"] Dec 05 23:41:38 crc kubenswrapper[4734]: I1205 23:41:38.544420 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4da9f35-b56d-47e7-9492-6e9379754584-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-tklww\" (UID: \"e4da9f35-b56d-47e7-9492-6e9379754584\") " pod="openstack/dnsmasq-dns-89c5cd4d5-tklww" Dec 05 23:41:38 crc kubenswrapper[4734]: I1205 23:41:38.544652 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4da9f35-b56d-47e7-9492-6e9379754584-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-tklww\" (UID: \"e4da9f35-b56d-47e7-9492-6e9379754584\") " pod="openstack/dnsmasq-dns-89c5cd4d5-tklww" Dec 05 23:41:38 crc kubenswrapper[4734]: I1205 23:41:38.544851 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e4da9f35-b56d-47e7-9492-6e9379754584-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-tklww\" (UID: \"e4da9f35-b56d-47e7-9492-6e9379754584\") " 
pod="openstack/dnsmasq-dns-89c5cd4d5-tklww" Dec 05 23:41:38 crc kubenswrapper[4734]: I1205 23:41:38.545008 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4da9f35-b56d-47e7-9492-6e9379754584-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-tklww\" (UID: \"e4da9f35-b56d-47e7-9492-6e9379754584\") " pod="openstack/dnsmasq-dns-89c5cd4d5-tklww" Dec 05 23:41:38 crc kubenswrapper[4734]: I1205 23:41:38.545149 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blrh8\" (UniqueName: \"kubernetes.io/projected/e4da9f35-b56d-47e7-9492-6e9379754584-kube-api-access-blrh8\") pod \"dnsmasq-dns-89c5cd4d5-tklww\" (UID: \"e4da9f35-b56d-47e7-9492-6e9379754584\") " pod="openstack/dnsmasq-dns-89c5cd4d5-tklww" Dec 05 23:41:38 crc kubenswrapper[4734]: I1205 23:41:38.545183 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4da9f35-b56d-47e7-9492-6e9379754584-config\") pod \"dnsmasq-dns-89c5cd4d5-tklww\" (UID: \"e4da9f35-b56d-47e7-9492-6e9379754584\") " pod="openstack/dnsmasq-dns-89c5cd4d5-tklww" Dec 05 23:41:38 crc kubenswrapper[4734]: I1205 23:41:38.647070 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4da9f35-b56d-47e7-9492-6e9379754584-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-tklww\" (UID: \"e4da9f35-b56d-47e7-9492-6e9379754584\") " pod="openstack/dnsmasq-dns-89c5cd4d5-tklww" Dec 05 23:41:38 crc kubenswrapper[4734]: I1205 23:41:38.647181 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4da9f35-b56d-47e7-9492-6e9379754584-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-tklww\" (UID: \"e4da9f35-b56d-47e7-9492-6e9379754584\") " 
pod="openstack/dnsmasq-dns-89c5cd4d5-tklww" Dec 05 23:41:38 crc kubenswrapper[4734]: I1205 23:41:38.647256 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e4da9f35-b56d-47e7-9492-6e9379754584-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-tklww\" (UID: \"e4da9f35-b56d-47e7-9492-6e9379754584\") " pod="openstack/dnsmasq-dns-89c5cd4d5-tklww" Dec 05 23:41:38 crc kubenswrapper[4734]: I1205 23:41:38.647309 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4da9f35-b56d-47e7-9492-6e9379754584-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-tklww\" (UID: \"e4da9f35-b56d-47e7-9492-6e9379754584\") " pod="openstack/dnsmasq-dns-89c5cd4d5-tklww" Dec 05 23:41:38 crc kubenswrapper[4734]: I1205 23:41:38.647369 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blrh8\" (UniqueName: \"kubernetes.io/projected/e4da9f35-b56d-47e7-9492-6e9379754584-kube-api-access-blrh8\") pod \"dnsmasq-dns-89c5cd4d5-tklww\" (UID: \"e4da9f35-b56d-47e7-9492-6e9379754584\") " pod="openstack/dnsmasq-dns-89c5cd4d5-tklww" Dec 05 23:41:38 crc kubenswrapper[4734]: I1205 23:41:38.647391 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4da9f35-b56d-47e7-9492-6e9379754584-config\") pod \"dnsmasq-dns-89c5cd4d5-tklww\" (UID: \"e4da9f35-b56d-47e7-9492-6e9379754584\") " pod="openstack/dnsmasq-dns-89c5cd4d5-tklww" Dec 05 23:41:38 crc kubenswrapper[4734]: I1205 23:41:38.648639 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4da9f35-b56d-47e7-9492-6e9379754584-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-tklww\" (UID: \"e4da9f35-b56d-47e7-9492-6e9379754584\") " pod="openstack/dnsmasq-dns-89c5cd4d5-tklww" Dec 05 23:41:38 crc 
kubenswrapper[4734]: I1205 23:41:38.649026 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4da9f35-b56d-47e7-9492-6e9379754584-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-tklww\" (UID: \"e4da9f35-b56d-47e7-9492-6e9379754584\") " pod="openstack/dnsmasq-dns-89c5cd4d5-tklww" Dec 05 23:41:38 crc kubenswrapper[4734]: I1205 23:41:38.649223 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4da9f35-b56d-47e7-9492-6e9379754584-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-tklww\" (UID: \"e4da9f35-b56d-47e7-9492-6e9379754584\") " pod="openstack/dnsmasq-dns-89c5cd4d5-tklww" Dec 05 23:41:38 crc kubenswrapper[4734]: I1205 23:41:38.649362 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4da9f35-b56d-47e7-9492-6e9379754584-config\") pod \"dnsmasq-dns-89c5cd4d5-tklww\" (UID: \"e4da9f35-b56d-47e7-9492-6e9379754584\") " pod="openstack/dnsmasq-dns-89c5cd4d5-tklww" Dec 05 23:41:38 crc kubenswrapper[4734]: I1205 23:41:38.649455 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e4da9f35-b56d-47e7-9492-6e9379754584-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-tklww\" (UID: \"e4da9f35-b56d-47e7-9492-6e9379754584\") " pod="openstack/dnsmasq-dns-89c5cd4d5-tklww" Dec 05 23:41:38 crc kubenswrapper[4734]: I1205 23:41:38.674655 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blrh8\" (UniqueName: \"kubernetes.io/projected/e4da9f35-b56d-47e7-9492-6e9379754584-kube-api-access-blrh8\") pod \"dnsmasq-dns-89c5cd4d5-tklww\" (UID: \"e4da9f35-b56d-47e7-9492-6e9379754584\") " pod="openstack/dnsmasq-dns-89c5cd4d5-tklww" Dec 05 23:41:38 crc kubenswrapper[4734]: I1205 23:41:38.725926 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-tklww" Dec 05 23:41:38 crc kubenswrapper[4734]: I1205 23:41:38.890805 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"41bee178-e2d7-4047-9c0a-429dc21411ed","Type":"ContainerStarted","Data":"4ed625627fb2f4ccfd73888627adb35320ffb8510eed50d1258663001289daf2"} Dec 05 23:41:38 crc kubenswrapper[4734]: I1205 23:41:38.890876 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"41bee178-e2d7-4047-9c0a-429dc21411ed","Type":"ContainerStarted","Data":"463be94e45a4fb16d84af3922bdadccd87ecab5091f497d5b7dcf3b62fc1a2c8"} Dec 05 23:41:38 crc kubenswrapper[4734]: I1205 23:41:38.925585 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.9255508890000002 podStartE2EDuration="1.925550889s" podCreationTimestamp="2025-12-05 23:41:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:41:38.918867377 +0000 UTC m=+1319.602271663" watchObservedRunningTime="2025-12-05 23:41:38.925550889 +0000 UTC m=+1319.608955185" Dec 05 23:41:39 crc kubenswrapper[4734]: I1205 23:41:39.261796 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-tklww"] Dec 05 23:41:39 crc kubenswrapper[4734]: I1205 23:41:39.904609 4734 generic.go:334] "Generic (PLEG): container finished" podID="e4da9f35-b56d-47e7-9492-6e9379754584" containerID="f00caf4b47d9d95e2c7019ceb937384399123fab1923f9aa32c7a80977fe948c" exitCode=0 Dec 05 23:41:39 crc kubenswrapper[4734]: I1205 23:41:39.906493 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-tklww" event={"ID":"e4da9f35-b56d-47e7-9492-6e9379754584","Type":"ContainerDied","Data":"f00caf4b47d9d95e2c7019ceb937384399123fab1923f9aa32c7a80977fe948c"} Dec 05 
23:41:39 crc kubenswrapper[4734]: I1205 23:41:39.906553 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-tklww" event={"ID":"e4da9f35-b56d-47e7-9492-6e9379754584","Type":"ContainerStarted","Data":"4fda1812c6502facdd95bfe28189f58aa8947cb0c5d7c8f6bb20ec5f1fb3ad4a"} Dec 05 23:41:40 crc kubenswrapper[4734]: I1205 23:41:40.920696 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-tklww" event={"ID":"e4da9f35-b56d-47e7-9492-6e9379754584","Type":"ContainerStarted","Data":"f0806da6272183f3f8a186f6210f91fca2724e1b742f689684f3bc5b212b57c6"} Dec 05 23:41:40 crc kubenswrapper[4734]: I1205 23:41:40.924069 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-tklww" Dec 05 23:41:40 crc kubenswrapper[4734]: I1205 23:41:40.980099 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-tklww" podStartSLOduration=2.980064058 podStartE2EDuration="2.980064058s" podCreationTimestamp="2025-12-05 23:41:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:41:40.94593955 +0000 UTC m=+1321.629343826" watchObservedRunningTime="2025-12-05 23:41:40.980064058 +0000 UTC m=+1321.663468334" Dec 05 23:41:41 crc kubenswrapper[4734]: I1205 23:41:41.074572 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 23:41:41 crc kubenswrapper[4734]: I1205 23:41:41.074961 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0e1a3779-44d7-4c3a-89ee-db6e17bcd42e" containerName="ceilometer-central-agent" containerID="cri-o://c8c1e98a40bc0f96812dadcd4575b203ee7c79b647737eccd99e82658efaefa0" gracePeriod=30 Dec 05 23:41:41 crc kubenswrapper[4734]: I1205 23:41:41.075076 4734 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ceilometer-0" podUID="0e1a3779-44d7-4c3a-89ee-db6e17bcd42e" containerName="proxy-httpd" containerID="cri-o://84ec79b03026fd94534cface387c534591afdd4ae3aec1de79fdaa597d333e19" gracePeriod=30 Dec 05 23:41:41 crc kubenswrapper[4734]: I1205 23:41:41.075100 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0e1a3779-44d7-4c3a-89ee-db6e17bcd42e" containerName="ceilometer-notification-agent" containerID="cri-o://d4d1983b58b5522a54ba87ff322862b39b4506725b8a45764be4aa4dabad2e59" gracePeriod=30 Dec 05 23:41:41 crc kubenswrapper[4734]: I1205 23:41:41.075102 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0e1a3779-44d7-4c3a-89ee-db6e17bcd42e" containerName="sg-core" containerID="cri-o://85416fe942405d9f868ee60ff03406a7c963a8eb9655963149f5a25baf3ac145" gracePeriod=30 Dec 05 23:41:41 crc kubenswrapper[4734]: I1205 23:41:41.240404 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 23:41:41 crc kubenswrapper[4734]: I1205 23:41:41.240727 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b2236711-73b3-4063-8ca9-a349b15f26b9" containerName="nova-api-log" containerID="cri-o://24ccf66acfff43efc13998c653794fbaed143e80efaddd27a4a1738ad57f1e7b" gracePeriod=30 Dec 05 23:41:41 crc kubenswrapper[4734]: I1205 23:41:41.241045 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b2236711-73b3-4063-8ca9-a349b15f26b9" containerName="nova-api-api" containerID="cri-o://0ea414b7c1caae3e32ba909cdd1c87ac566cc98790987839ea53c27c028792d2" gracePeriod=30 Dec 05 23:41:41 crc kubenswrapper[4734]: I1205 23:41:41.939510 4734 generic.go:334] "Generic (PLEG): container finished" podID="b2236711-73b3-4063-8ca9-a349b15f26b9" containerID="24ccf66acfff43efc13998c653794fbaed143e80efaddd27a4a1738ad57f1e7b" 
exitCode=143 Dec 05 23:41:41 crc kubenswrapper[4734]: I1205 23:41:41.939851 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b2236711-73b3-4063-8ca9-a349b15f26b9","Type":"ContainerDied","Data":"24ccf66acfff43efc13998c653794fbaed143e80efaddd27a4a1738ad57f1e7b"} Dec 05 23:41:41 crc kubenswrapper[4734]: I1205 23:41:41.947884 4734 generic.go:334] "Generic (PLEG): container finished" podID="0e1a3779-44d7-4c3a-89ee-db6e17bcd42e" containerID="84ec79b03026fd94534cface387c534591afdd4ae3aec1de79fdaa597d333e19" exitCode=0 Dec 05 23:41:41 crc kubenswrapper[4734]: I1205 23:41:41.947913 4734 generic.go:334] "Generic (PLEG): container finished" podID="0e1a3779-44d7-4c3a-89ee-db6e17bcd42e" containerID="85416fe942405d9f868ee60ff03406a7c963a8eb9655963149f5a25baf3ac145" exitCode=2 Dec 05 23:41:41 crc kubenswrapper[4734]: I1205 23:41:41.947922 4734 generic.go:334] "Generic (PLEG): container finished" podID="0e1a3779-44d7-4c3a-89ee-db6e17bcd42e" containerID="c8c1e98a40bc0f96812dadcd4575b203ee7c79b647737eccd99e82658efaefa0" exitCode=0 Dec 05 23:41:41 crc kubenswrapper[4734]: I1205 23:41:41.949042 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e","Type":"ContainerDied","Data":"84ec79b03026fd94534cface387c534591afdd4ae3aec1de79fdaa597d333e19"} Dec 05 23:41:41 crc kubenswrapper[4734]: I1205 23:41:41.949087 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e","Type":"ContainerDied","Data":"85416fe942405d9f868ee60ff03406a7c963a8eb9655963149f5a25baf3ac145"} Dec 05 23:41:41 crc kubenswrapper[4734]: I1205 23:41:41.949102 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e","Type":"ContainerDied","Data":"c8c1e98a40bc0f96812dadcd4575b203ee7c79b647737eccd99e82658efaefa0"} Dec 05 23:41:42 crc 
kubenswrapper[4734]: I1205 23:41:42.543703 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:41:44 crc kubenswrapper[4734]: I1205 23:41:44.985775 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 23:41:44 crc kubenswrapper[4734]: I1205 23:41:44.995356 4734 generic.go:334] "Generic (PLEG): container finished" podID="b2236711-73b3-4063-8ca9-a349b15f26b9" containerID="0ea414b7c1caae3e32ba909cdd1c87ac566cc98790987839ea53c27c028792d2" exitCode=0 Dec 05 23:41:44 crc kubenswrapper[4734]: I1205 23:41:44.995460 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b2236711-73b3-4063-8ca9-a349b15f26b9","Type":"ContainerDied","Data":"0ea414b7c1caae3e32ba909cdd1c87ac566cc98790987839ea53c27c028792d2"} Dec 05 23:41:44 crc kubenswrapper[4734]: I1205 23:41:44.995695 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b2236711-73b3-4063-8ca9-a349b15f26b9","Type":"ContainerDied","Data":"73e26de52199e56ee293ea9cc1f8b8542294c11710a7a5be8d55e4756227e976"} Dec 05 23:41:44 crc kubenswrapper[4734]: I1205 23:41:44.995753 4734 scope.go:117] "RemoveContainer" containerID="0ea414b7c1caae3e32ba909cdd1c87ac566cc98790987839ea53c27c028792d2" Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.007197 4734 generic.go:334] "Generic (PLEG): container finished" podID="0e1a3779-44d7-4c3a-89ee-db6e17bcd42e" containerID="d4d1983b58b5522a54ba87ff322862b39b4506725b8a45764be4aa4dabad2e59" exitCode=0 Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.007272 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e","Type":"ContainerDied","Data":"d4d1983b58b5522a54ba87ff322862b39b4506725b8a45764be4aa4dabad2e59"} Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.016601 4734 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-w7lgm\" (UniqueName: \"kubernetes.io/projected/b2236711-73b3-4063-8ca9-a349b15f26b9-kube-api-access-w7lgm\") pod \"b2236711-73b3-4063-8ca9-a349b15f26b9\" (UID: \"b2236711-73b3-4063-8ca9-a349b15f26b9\") " Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.016697 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2236711-73b3-4063-8ca9-a349b15f26b9-combined-ca-bundle\") pod \"b2236711-73b3-4063-8ca9-a349b15f26b9\" (UID: \"b2236711-73b3-4063-8ca9-a349b15f26b9\") " Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.040036 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2236711-73b3-4063-8ca9-a349b15f26b9-kube-api-access-w7lgm" (OuterVolumeSpecName: "kube-api-access-w7lgm") pod "b2236711-73b3-4063-8ca9-a349b15f26b9" (UID: "b2236711-73b3-4063-8ca9-a349b15f26b9"). InnerVolumeSpecName "kube-api-access-w7lgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.059330 4734 scope.go:117] "RemoveContainer" containerID="24ccf66acfff43efc13998c653794fbaed143e80efaddd27a4a1738ad57f1e7b" Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.103475 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2236711-73b3-4063-8ca9-a349b15f26b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2236711-73b3-4063-8ca9-a349b15f26b9" (UID: "b2236711-73b3-4063-8ca9-a349b15f26b9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.119108 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2236711-73b3-4063-8ca9-a349b15f26b9-logs\") pod \"b2236711-73b3-4063-8ca9-a349b15f26b9\" (UID: \"b2236711-73b3-4063-8ca9-a349b15f26b9\") " Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.119453 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2236711-73b3-4063-8ca9-a349b15f26b9-config-data\") pod \"b2236711-73b3-4063-8ca9-a349b15f26b9\" (UID: \"b2236711-73b3-4063-8ca9-a349b15f26b9\") " Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.119754 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2236711-73b3-4063-8ca9-a349b15f26b9-logs" (OuterVolumeSpecName: "logs") pod "b2236711-73b3-4063-8ca9-a349b15f26b9" (UID: "b2236711-73b3-4063-8ca9-a349b15f26b9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.120324 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7lgm\" (UniqueName: \"kubernetes.io/projected/b2236711-73b3-4063-8ca9-a349b15f26b9-kube-api-access-w7lgm\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.120346 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2236711-73b3-4063-8ca9-a349b15f26b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.120359 4734 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2236711-73b3-4063-8ca9-a349b15f26b9-logs\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.163892 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2236711-73b3-4063-8ca9-a349b15f26b9-config-data" (OuterVolumeSpecName: "config-data") pod "b2236711-73b3-4063-8ca9-a349b15f26b9" (UID: "b2236711-73b3-4063-8ca9-a349b15f26b9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.180760 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.185832 4734 scope.go:117] "RemoveContainer" containerID="0ea414b7c1caae3e32ba909cdd1c87ac566cc98790987839ea53c27c028792d2" Dec 05 23:41:45 crc kubenswrapper[4734]: E1205 23:41:45.186838 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ea414b7c1caae3e32ba909cdd1c87ac566cc98790987839ea53c27c028792d2\": container with ID starting with 0ea414b7c1caae3e32ba909cdd1c87ac566cc98790987839ea53c27c028792d2 not found: ID does not exist" containerID="0ea414b7c1caae3e32ba909cdd1c87ac566cc98790987839ea53c27c028792d2" Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.186880 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ea414b7c1caae3e32ba909cdd1c87ac566cc98790987839ea53c27c028792d2"} err="failed to get container status \"0ea414b7c1caae3e32ba909cdd1c87ac566cc98790987839ea53c27c028792d2\": rpc error: code = NotFound desc = could not find container \"0ea414b7c1caae3e32ba909cdd1c87ac566cc98790987839ea53c27c028792d2\": container with ID starting with 0ea414b7c1caae3e32ba909cdd1c87ac566cc98790987839ea53c27c028792d2 not found: ID does not exist" Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.186926 4734 scope.go:117] "RemoveContainer" containerID="24ccf66acfff43efc13998c653794fbaed143e80efaddd27a4a1738ad57f1e7b" Dec 05 23:41:45 crc kubenswrapper[4734]: E1205 23:41:45.187371 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24ccf66acfff43efc13998c653794fbaed143e80efaddd27a4a1738ad57f1e7b\": container with ID starting with 24ccf66acfff43efc13998c653794fbaed143e80efaddd27a4a1738ad57f1e7b not found: ID does not exist" containerID="24ccf66acfff43efc13998c653794fbaed143e80efaddd27a4a1738ad57f1e7b" Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.187427 
4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24ccf66acfff43efc13998c653794fbaed143e80efaddd27a4a1738ad57f1e7b"} err="failed to get container status \"24ccf66acfff43efc13998c653794fbaed143e80efaddd27a4a1738ad57f1e7b\": rpc error: code = NotFound desc = could not find container \"24ccf66acfff43efc13998c653794fbaed143e80efaddd27a4a1738ad57f1e7b\": container with ID starting with 24ccf66acfff43efc13998c653794fbaed143e80efaddd27a4a1738ad57f1e7b not found: ID does not exist" Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.222735 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2236711-73b3-4063-8ca9-a349b15f26b9-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.324719 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-ceilometer-tls-certs\") pod \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\" (UID: \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\") " Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.324868 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-config-data\") pod \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\" (UID: \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\") " Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.324980 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-log-httpd\") pod \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\" (UID: \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\") " Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.325027 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-combined-ca-bundle\") pod \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\" (UID: \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\") " Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.325096 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9w8x\" (UniqueName: \"kubernetes.io/projected/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-kube-api-access-s9w8x\") pod \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\" (UID: \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\") " Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.325156 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-sg-core-conf-yaml\") pod \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\" (UID: \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\") " Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.325199 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-run-httpd\") pod \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\" (UID: \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\") " Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.325475 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-scripts\") pod \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\" (UID: \"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e\") " Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.325658 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0e1a3779-44d7-4c3a-89ee-db6e17bcd42e" (UID: "0e1a3779-44d7-4c3a-89ee-db6e17bcd42e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.326115 4734 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.326802 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0e1a3779-44d7-4c3a-89ee-db6e17bcd42e" (UID: "0e1a3779-44d7-4c3a-89ee-db6e17bcd42e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.329858 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-kube-api-access-s9w8x" (OuterVolumeSpecName: "kube-api-access-s9w8x") pod "0e1a3779-44d7-4c3a-89ee-db6e17bcd42e" (UID: "0e1a3779-44d7-4c3a-89ee-db6e17bcd42e"). InnerVolumeSpecName "kube-api-access-s9w8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.333769 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-scripts" (OuterVolumeSpecName: "scripts") pod "0e1a3779-44d7-4c3a-89ee-db6e17bcd42e" (UID: "0e1a3779-44d7-4c3a-89ee-db6e17bcd42e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.373750 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0e1a3779-44d7-4c3a-89ee-db6e17bcd42e" (UID: "0e1a3779-44d7-4c3a-89ee-db6e17bcd42e"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.416823 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "0e1a3779-44d7-4c3a-89ee-db6e17bcd42e" (UID: "0e1a3779-44d7-4c3a-89ee-db6e17bcd42e"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.429691 4734 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.429763 4734 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.429787 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9w8x\" (UniqueName: \"kubernetes.io/projected/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-kube-api-access-s9w8x\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.429805 4734 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.429819 4734 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.452051 4734 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e1a3779-44d7-4c3a-89ee-db6e17bcd42e" (UID: "0e1a3779-44d7-4c3a-89ee-db6e17bcd42e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.479109 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-config-data" (OuterVolumeSpecName: "config-data") pod "0e1a3779-44d7-4c3a-89ee-db6e17bcd42e" (UID: "0e1a3779-44d7-4c3a-89ee-db6e17bcd42e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.532629 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:45 crc kubenswrapper[4734]: I1205 23:41:45.532686 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.018756 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.025680 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e1a3779-44d7-4c3a-89ee-db6e17bcd42e","Type":"ContainerDied","Data":"3b194672658ec89132d26912cc69aabdbbace67b13ff27532e8d98de6ec4eade"} Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.025744 4734 scope.go:117] "RemoveContainer" containerID="84ec79b03026fd94534cface387c534591afdd4ae3aec1de79fdaa597d333e19" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.025762 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.058635 4734 scope.go:117] "RemoveContainer" containerID="85416fe942405d9f868ee60ff03406a7c963a8eb9655963149f5a25baf3ac145" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.138118 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.163555 4734 scope.go:117] "RemoveContainer" containerID="d4d1983b58b5522a54ba87ff322862b39b4506725b8a45764be4aa4dabad2e59" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.174731 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.208190 4734 scope.go:117] "RemoveContainer" containerID="c8c1e98a40bc0f96812dadcd4575b203ee7c79b647737eccd99e82658efaefa0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.218434 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.244675 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 23:41:46 crc kubenswrapper[4734]: E1205 23:41:46.245186 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2236711-73b3-4063-8ca9-a349b15f26b9" 
containerName="nova-api-log" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.245208 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2236711-73b3-4063-8ca9-a349b15f26b9" containerName="nova-api-log" Dec 05 23:41:46 crc kubenswrapper[4734]: E1205 23:41:46.245224 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e1a3779-44d7-4c3a-89ee-db6e17bcd42e" containerName="ceilometer-central-agent" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.245232 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e1a3779-44d7-4c3a-89ee-db6e17bcd42e" containerName="ceilometer-central-agent" Dec 05 23:41:46 crc kubenswrapper[4734]: E1205 23:41:46.245248 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e1a3779-44d7-4c3a-89ee-db6e17bcd42e" containerName="proxy-httpd" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.245255 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e1a3779-44d7-4c3a-89ee-db6e17bcd42e" containerName="proxy-httpd" Dec 05 23:41:46 crc kubenswrapper[4734]: E1205 23:41:46.245272 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e1a3779-44d7-4c3a-89ee-db6e17bcd42e" containerName="sg-core" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.245278 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e1a3779-44d7-4c3a-89ee-db6e17bcd42e" containerName="sg-core" Dec 05 23:41:46 crc kubenswrapper[4734]: E1205 23:41:46.245298 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2236711-73b3-4063-8ca9-a349b15f26b9" containerName="nova-api-api" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.245305 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2236711-73b3-4063-8ca9-a349b15f26b9" containerName="nova-api-api" Dec 05 23:41:46 crc kubenswrapper[4734]: E1205 23:41:46.245321 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e1a3779-44d7-4c3a-89ee-db6e17bcd42e" 
containerName="ceilometer-notification-agent" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.245329 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e1a3779-44d7-4c3a-89ee-db6e17bcd42e" containerName="ceilometer-notification-agent" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.245556 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e1a3779-44d7-4c3a-89ee-db6e17bcd42e" containerName="ceilometer-notification-agent" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.245569 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2236711-73b3-4063-8ca9-a349b15f26b9" containerName="nova-api-log" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.245586 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2236711-73b3-4063-8ca9-a349b15f26b9" containerName="nova-api-api" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.245595 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e1a3779-44d7-4c3a-89ee-db6e17bcd42e" containerName="sg-core" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.245603 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e1a3779-44d7-4c3a-89ee-db6e17bcd42e" containerName="proxy-httpd" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.245616 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e1a3779-44d7-4c3a-89ee-db6e17bcd42e" containerName="ceilometer-central-agent" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.246855 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.249845 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.253497 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.253758 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.253879 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.267050 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.279437 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.282271 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.284575 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.284629 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.286418 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.288871 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.361589 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-config-data\") pod \"nova-api-0\" (UID: \"dd1e86de-adbe-4725-afd3-37e8ef3d0d39\") " pod="openstack/nova-api-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.361709 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dd1e86de-adbe-4725-afd3-37e8ef3d0d39\") " pod="openstack/nova-api-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.361741 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcwgw\" (UniqueName: \"kubernetes.io/projected/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-kube-api-access-zcwgw\") pod \"nova-api-0\" (UID: \"dd1e86de-adbe-4725-afd3-37e8ef3d0d39\") " pod="openstack/nova-api-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.361787 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-internal-tls-certs\") pod \"nova-api-0\" (UID: \"dd1e86de-adbe-4725-afd3-37e8ef3d0d39\") " pod="openstack/nova-api-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.361822 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-public-tls-certs\") pod \"nova-api-0\" (UID: \"dd1e86de-adbe-4725-afd3-37e8ef3d0d39\") " pod="openstack/nova-api-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.361878 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-logs\") pod \"nova-api-0\" (UID: \"dd1e86de-adbe-4725-afd3-37e8ef3d0d39\") " pod="openstack/nova-api-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.464209 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/90a03731-2e0d-4698-a55e-0af3ef5372be-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"90a03731-2e0d-4698-a55e-0af3ef5372be\") " pod="openstack/ceilometer-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.464303 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90a03731-2e0d-4698-a55e-0af3ef5372be-scripts\") pod \"ceilometer-0\" (UID: \"90a03731-2e0d-4698-a55e-0af3ef5372be\") " pod="openstack/ceilometer-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.464331 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90a03731-2e0d-4698-a55e-0af3ef5372be-log-httpd\") pod \"ceilometer-0\" (UID: 
\"90a03731-2e0d-4698-a55e-0af3ef5372be\") " pod="openstack/ceilometer-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.464362 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dd1e86de-adbe-4725-afd3-37e8ef3d0d39\") " pod="openstack/nova-api-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.464398 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcwgw\" (UniqueName: \"kubernetes.io/projected/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-kube-api-access-zcwgw\") pod \"nova-api-0\" (UID: \"dd1e86de-adbe-4725-afd3-37e8ef3d0d39\") " pod="openstack/nova-api-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.464445 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-internal-tls-certs\") pod \"nova-api-0\" (UID: \"dd1e86de-adbe-4725-afd3-37e8ef3d0d39\") " pod="openstack/nova-api-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.464476 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-public-tls-certs\") pod \"nova-api-0\" (UID: \"dd1e86de-adbe-4725-afd3-37e8ef3d0d39\") " pod="openstack/nova-api-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.464501 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90a03731-2e0d-4698-a55e-0af3ef5372be-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"90a03731-2e0d-4698-a55e-0af3ef5372be\") " pod="openstack/ceilometer-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.464546 4734 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/90a03731-2e0d-4698-a55e-0af3ef5372be-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"90a03731-2e0d-4698-a55e-0af3ef5372be\") " pod="openstack/ceilometer-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.464573 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90a03731-2e0d-4698-a55e-0af3ef5372be-run-httpd\") pod \"ceilometer-0\" (UID: \"90a03731-2e0d-4698-a55e-0af3ef5372be\") " pod="openstack/ceilometer-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.464607 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg88h\" (UniqueName: \"kubernetes.io/projected/90a03731-2e0d-4698-a55e-0af3ef5372be-kube-api-access-fg88h\") pod \"ceilometer-0\" (UID: \"90a03731-2e0d-4698-a55e-0af3ef5372be\") " pod="openstack/ceilometer-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.464644 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-logs\") pod \"nova-api-0\" (UID: \"dd1e86de-adbe-4725-afd3-37e8ef3d0d39\") " pod="openstack/nova-api-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.464673 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90a03731-2e0d-4698-a55e-0af3ef5372be-config-data\") pod \"ceilometer-0\" (UID: \"90a03731-2e0d-4698-a55e-0af3ef5372be\") " pod="openstack/ceilometer-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.464707 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-config-data\") pod 
\"nova-api-0\" (UID: \"dd1e86de-adbe-4725-afd3-37e8ef3d0d39\") " pod="openstack/nova-api-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.465714 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-logs\") pod \"nova-api-0\" (UID: \"dd1e86de-adbe-4725-afd3-37e8ef3d0d39\") " pod="openstack/nova-api-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.471477 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dd1e86de-adbe-4725-afd3-37e8ef3d0d39\") " pod="openstack/nova-api-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.471478 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-internal-tls-certs\") pod \"nova-api-0\" (UID: \"dd1e86de-adbe-4725-afd3-37e8ef3d0d39\") " pod="openstack/nova-api-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.473027 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-public-tls-certs\") pod \"nova-api-0\" (UID: \"dd1e86de-adbe-4725-afd3-37e8ef3d0d39\") " pod="openstack/nova-api-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.475477 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-config-data\") pod \"nova-api-0\" (UID: \"dd1e86de-adbe-4725-afd3-37e8ef3d0d39\") " pod="openstack/nova-api-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.486379 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcwgw\" (UniqueName: 
\"kubernetes.io/projected/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-kube-api-access-zcwgw\") pod \"nova-api-0\" (UID: \"dd1e86de-adbe-4725-afd3-37e8ef3d0d39\") " pod="openstack/nova-api-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.566090 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.568961 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90a03731-2e0d-4698-a55e-0af3ef5372be-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"90a03731-2e0d-4698-a55e-0af3ef5372be\") " pod="openstack/ceilometer-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.569043 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/90a03731-2e0d-4698-a55e-0af3ef5372be-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"90a03731-2e0d-4698-a55e-0af3ef5372be\") " pod="openstack/ceilometer-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.569088 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90a03731-2e0d-4698-a55e-0af3ef5372be-run-httpd\") pod \"ceilometer-0\" (UID: \"90a03731-2e0d-4698-a55e-0af3ef5372be\") " pod="openstack/ceilometer-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.569134 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg88h\" (UniqueName: \"kubernetes.io/projected/90a03731-2e0d-4698-a55e-0af3ef5372be-kube-api-access-fg88h\") pod \"ceilometer-0\" (UID: \"90a03731-2e0d-4698-a55e-0af3ef5372be\") " pod="openstack/ceilometer-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.569212 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/90a03731-2e0d-4698-a55e-0af3ef5372be-config-data\") pod \"ceilometer-0\" (UID: \"90a03731-2e0d-4698-a55e-0af3ef5372be\") " pod="openstack/ceilometer-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.569281 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/90a03731-2e0d-4698-a55e-0af3ef5372be-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"90a03731-2e0d-4698-a55e-0af3ef5372be\") " pod="openstack/ceilometer-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.569361 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90a03731-2e0d-4698-a55e-0af3ef5372be-scripts\") pod \"ceilometer-0\" (UID: \"90a03731-2e0d-4698-a55e-0af3ef5372be\") " pod="openstack/ceilometer-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.569404 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90a03731-2e0d-4698-a55e-0af3ef5372be-log-httpd\") pod \"ceilometer-0\" (UID: \"90a03731-2e0d-4698-a55e-0af3ef5372be\") " pod="openstack/ceilometer-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.569891 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90a03731-2e0d-4698-a55e-0af3ef5372be-run-httpd\") pod \"ceilometer-0\" (UID: \"90a03731-2e0d-4698-a55e-0af3ef5372be\") " pod="openstack/ceilometer-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.569946 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90a03731-2e0d-4698-a55e-0af3ef5372be-log-httpd\") pod \"ceilometer-0\" (UID: \"90a03731-2e0d-4698-a55e-0af3ef5372be\") " pod="openstack/ceilometer-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.574696 4734 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/90a03731-2e0d-4698-a55e-0af3ef5372be-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"90a03731-2e0d-4698-a55e-0af3ef5372be\") " pod="openstack/ceilometer-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.575048 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90a03731-2e0d-4698-a55e-0af3ef5372be-scripts\") pod \"ceilometer-0\" (UID: \"90a03731-2e0d-4698-a55e-0af3ef5372be\") " pod="openstack/ceilometer-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.577510 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/90a03731-2e0d-4698-a55e-0af3ef5372be-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"90a03731-2e0d-4698-a55e-0af3ef5372be\") " pod="openstack/ceilometer-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.577754 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90a03731-2e0d-4698-a55e-0af3ef5372be-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"90a03731-2e0d-4698-a55e-0af3ef5372be\") " pod="openstack/ceilometer-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.578264 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90a03731-2e0d-4698-a55e-0af3ef5372be-config-data\") pod \"ceilometer-0\" (UID: \"90a03731-2e0d-4698-a55e-0af3ef5372be\") " pod="openstack/ceilometer-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.598398 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg88h\" (UniqueName: \"kubernetes.io/projected/90a03731-2e0d-4698-a55e-0af3ef5372be-kube-api-access-fg88h\") pod \"ceilometer-0\" (UID: \"90a03731-2e0d-4698-a55e-0af3ef5372be\") 
" pod="openstack/ceilometer-0" Dec 05 23:41:46 crc kubenswrapper[4734]: I1205 23:41:46.612621 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 23:41:47 crc kubenswrapper[4734]: I1205 23:41:47.100567 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 23:41:47 crc kubenswrapper[4734]: I1205 23:41:47.241667 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 23:41:47 crc kubenswrapper[4734]: W1205 23:41:47.250711 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90a03731_2e0d_4698_a55e_0af3ef5372be.slice/crio-908ec4f082a6567c773cc1af387525af7bb22695f448e8921eb341cb2be89a3a WatchSource:0}: Error finding container 908ec4f082a6567c773cc1af387525af7bb22695f448e8921eb341cb2be89a3a: Status 404 returned error can't find the container with id 908ec4f082a6567c773cc1af387525af7bb22695f448e8921eb341cb2be89a3a Dec 05 23:41:47 crc kubenswrapper[4734]: I1205 23:41:47.544141 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:41:47 crc kubenswrapper[4734]: I1205 23:41:47.567639 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:41:47 crc kubenswrapper[4734]: I1205 23:41:47.638298 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e1a3779-44d7-4c3a-89ee-db6e17bcd42e" path="/var/lib/kubelet/pods/0e1a3779-44d7-4c3a-89ee-db6e17bcd42e/volumes" Dec 05 23:41:47 crc kubenswrapper[4734]: I1205 23:41:47.639272 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2236711-73b3-4063-8ca9-a349b15f26b9" path="/var/lib/kubelet/pods/b2236711-73b3-4063-8ca9-a349b15f26b9/volumes" Dec 05 23:41:48 crc kubenswrapper[4734]: I1205 23:41:48.073624 4734 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"90a03731-2e0d-4698-a55e-0af3ef5372be","Type":"ContainerStarted","Data":"908ec4f082a6567c773cc1af387525af7bb22695f448e8921eb341cb2be89a3a"} Dec 05 23:41:48 crc kubenswrapper[4734]: I1205 23:41:48.078086 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dd1e86de-adbe-4725-afd3-37e8ef3d0d39","Type":"ContainerStarted","Data":"1a31348ac28e90362d5f56a88b54b08ebf326cb09cfabfe8ea7a0b1a8e84fd26"} Dec 05 23:41:48 crc kubenswrapper[4734]: I1205 23:41:48.078140 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dd1e86de-adbe-4725-afd3-37e8ef3d0d39","Type":"ContainerStarted","Data":"c44cfd08e384ecddef7611e8283efb7e2741a145c1725865289789e5265e58e7"} Dec 05 23:41:48 crc kubenswrapper[4734]: I1205 23:41:48.078155 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dd1e86de-adbe-4725-afd3-37e8ef3d0d39","Type":"ContainerStarted","Data":"1c52763c1a7485e8391bfcdf4157663864d10ccfeb12cfdd3a9aa8dc2d5e6834"} Dec 05 23:41:48 crc kubenswrapper[4734]: I1205 23:41:48.101045 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 05 23:41:48 crc kubenswrapper[4734]: I1205 23:41:48.121812 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.121767199 podStartE2EDuration="2.121767199s" podCreationTimestamp="2025-12-05 23:41:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:41:48.104337976 +0000 UTC m=+1328.787742262" watchObservedRunningTime="2025-12-05 23:41:48.121767199 +0000 UTC m=+1328.805171485" Dec 05 23:41:48 crc kubenswrapper[4734]: I1205 23:41:48.365365 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-pc6gs"] Dec 05 23:41:48 crc 
kubenswrapper[4734]: I1205 23:41:48.367089 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pc6gs" Dec 05 23:41:48 crc kubenswrapper[4734]: I1205 23:41:48.372394 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 05 23:41:48 crc kubenswrapper[4734]: I1205 23:41:48.372752 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 05 23:41:48 crc kubenswrapper[4734]: I1205 23:41:48.398155 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-pc6gs"] Dec 05 23:41:48 crc kubenswrapper[4734]: I1205 23:41:48.528623 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dcbaddd-94e5-4096-832c-a3ea35b141b8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pc6gs\" (UID: \"8dcbaddd-94e5-4096-832c-a3ea35b141b8\") " pod="openstack/nova-cell1-cell-mapping-pc6gs" Dec 05 23:41:48 crc kubenswrapper[4734]: I1205 23:41:48.528693 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44m4n\" (UniqueName: \"kubernetes.io/projected/8dcbaddd-94e5-4096-832c-a3ea35b141b8-kube-api-access-44m4n\") pod \"nova-cell1-cell-mapping-pc6gs\" (UID: \"8dcbaddd-94e5-4096-832c-a3ea35b141b8\") " pod="openstack/nova-cell1-cell-mapping-pc6gs" Dec 05 23:41:48 crc kubenswrapper[4734]: I1205 23:41:48.528732 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dcbaddd-94e5-4096-832c-a3ea35b141b8-config-data\") pod \"nova-cell1-cell-mapping-pc6gs\" (UID: \"8dcbaddd-94e5-4096-832c-a3ea35b141b8\") " pod="openstack/nova-cell1-cell-mapping-pc6gs" Dec 05 23:41:48 crc kubenswrapper[4734]: I1205 23:41:48.528858 4734 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dcbaddd-94e5-4096-832c-a3ea35b141b8-scripts\") pod \"nova-cell1-cell-mapping-pc6gs\" (UID: \"8dcbaddd-94e5-4096-832c-a3ea35b141b8\") " pod="openstack/nova-cell1-cell-mapping-pc6gs" Dec 05 23:41:48 crc kubenswrapper[4734]: I1205 23:41:48.631290 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dcbaddd-94e5-4096-832c-a3ea35b141b8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pc6gs\" (UID: \"8dcbaddd-94e5-4096-832c-a3ea35b141b8\") " pod="openstack/nova-cell1-cell-mapping-pc6gs" Dec 05 23:41:48 crc kubenswrapper[4734]: I1205 23:41:48.631367 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44m4n\" (UniqueName: \"kubernetes.io/projected/8dcbaddd-94e5-4096-832c-a3ea35b141b8-kube-api-access-44m4n\") pod \"nova-cell1-cell-mapping-pc6gs\" (UID: \"8dcbaddd-94e5-4096-832c-a3ea35b141b8\") " pod="openstack/nova-cell1-cell-mapping-pc6gs" Dec 05 23:41:48 crc kubenswrapper[4734]: I1205 23:41:48.631434 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dcbaddd-94e5-4096-832c-a3ea35b141b8-config-data\") pod \"nova-cell1-cell-mapping-pc6gs\" (UID: \"8dcbaddd-94e5-4096-832c-a3ea35b141b8\") " pod="openstack/nova-cell1-cell-mapping-pc6gs" Dec 05 23:41:48 crc kubenswrapper[4734]: I1205 23:41:48.631580 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dcbaddd-94e5-4096-832c-a3ea35b141b8-scripts\") pod \"nova-cell1-cell-mapping-pc6gs\" (UID: \"8dcbaddd-94e5-4096-832c-a3ea35b141b8\") " pod="openstack/nova-cell1-cell-mapping-pc6gs" Dec 05 23:41:48 crc kubenswrapper[4734]: I1205 23:41:48.639098 4734 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dcbaddd-94e5-4096-832c-a3ea35b141b8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pc6gs\" (UID: \"8dcbaddd-94e5-4096-832c-a3ea35b141b8\") " pod="openstack/nova-cell1-cell-mapping-pc6gs" Dec 05 23:41:48 crc kubenswrapper[4734]: I1205 23:41:48.639949 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dcbaddd-94e5-4096-832c-a3ea35b141b8-config-data\") pod \"nova-cell1-cell-mapping-pc6gs\" (UID: \"8dcbaddd-94e5-4096-832c-a3ea35b141b8\") " pod="openstack/nova-cell1-cell-mapping-pc6gs" Dec 05 23:41:48 crc kubenswrapper[4734]: I1205 23:41:48.640591 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dcbaddd-94e5-4096-832c-a3ea35b141b8-scripts\") pod \"nova-cell1-cell-mapping-pc6gs\" (UID: \"8dcbaddd-94e5-4096-832c-a3ea35b141b8\") " pod="openstack/nova-cell1-cell-mapping-pc6gs" Dec 05 23:41:48 crc kubenswrapper[4734]: I1205 23:41:48.657084 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44m4n\" (UniqueName: \"kubernetes.io/projected/8dcbaddd-94e5-4096-832c-a3ea35b141b8-kube-api-access-44m4n\") pod \"nova-cell1-cell-mapping-pc6gs\" (UID: \"8dcbaddd-94e5-4096-832c-a3ea35b141b8\") " pod="openstack/nova-cell1-cell-mapping-pc6gs" Dec 05 23:41:48 crc kubenswrapper[4734]: I1205 23:41:48.706659 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pc6gs" Dec 05 23:41:48 crc kubenswrapper[4734]: I1205 23:41:48.727786 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-tklww" Dec 05 23:41:48 crc kubenswrapper[4734]: I1205 23:41:48.856256 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-pg2kh"] Dec 05 23:41:48 crc kubenswrapper[4734]: I1205 23:41:48.856955 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-pg2kh" podUID="e5484b60-2312-4287-9d50-4c15f83f9253" containerName="dnsmasq-dns" containerID="cri-o://ece29f0471793b187008d55962f5263fc66b1c51ef70bbe010b455f83f4f8f38" gracePeriod=10 Dec 05 23:41:49 crc kubenswrapper[4734]: I1205 23:41:49.118887 4734 generic.go:334] "Generic (PLEG): container finished" podID="e5484b60-2312-4287-9d50-4c15f83f9253" containerID="ece29f0471793b187008d55962f5263fc66b1c51ef70bbe010b455f83f4f8f38" exitCode=0 Dec 05 23:41:49 crc kubenswrapper[4734]: I1205 23:41:49.118994 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-pg2kh" event={"ID":"e5484b60-2312-4287-9d50-4c15f83f9253","Type":"ContainerDied","Data":"ece29f0471793b187008d55962f5263fc66b1c51ef70bbe010b455f83f4f8f38"} Dec 05 23:41:49 crc kubenswrapper[4734]: I1205 23:41:49.132200 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"90a03731-2e0d-4698-a55e-0af3ef5372be","Type":"ContainerStarted","Data":"2bafa8f5249f14f9434708927aaf4ec7fdcd0855d470bb6c16684110ca3a3616"} Dec 05 23:41:49 crc kubenswrapper[4734]: W1205 23:41:49.433133 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8dcbaddd_94e5_4096_832c_a3ea35b141b8.slice/crio-330a17fe73577617de0dca36b5fb714bc8eab60b0295a5f842b8966dad4f4481 WatchSource:0}: Error finding container 
330a17fe73577617de0dca36b5fb714bc8eab60b0295a5f842b8966dad4f4481: Status 404 returned error can't find the container with id 330a17fe73577617de0dca36b5fb714bc8eab60b0295a5f842b8966dad4f4481 Dec 05 23:41:49 crc kubenswrapper[4734]: I1205 23:41:49.438619 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-pc6gs"] Dec 05 23:41:49 crc kubenswrapper[4734]: I1205 23:41:49.670685 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-pg2kh" Dec 05 23:41:49 crc kubenswrapper[4734]: I1205 23:41:49.800503 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57n7l\" (UniqueName: \"kubernetes.io/projected/e5484b60-2312-4287-9d50-4c15f83f9253-kube-api-access-57n7l\") pod \"e5484b60-2312-4287-9d50-4c15f83f9253\" (UID: \"e5484b60-2312-4287-9d50-4c15f83f9253\") " Dec 05 23:41:49 crc kubenswrapper[4734]: I1205 23:41:49.801497 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5484b60-2312-4287-9d50-4c15f83f9253-dns-svc\") pod \"e5484b60-2312-4287-9d50-4c15f83f9253\" (UID: \"e5484b60-2312-4287-9d50-4c15f83f9253\") " Dec 05 23:41:49 crc kubenswrapper[4734]: I1205 23:41:49.801558 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5484b60-2312-4287-9d50-4c15f83f9253-ovsdbserver-nb\") pod \"e5484b60-2312-4287-9d50-4c15f83f9253\" (UID: \"e5484b60-2312-4287-9d50-4c15f83f9253\") " Dec 05 23:41:49 crc kubenswrapper[4734]: I1205 23:41:49.801595 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5484b60-2312-4287-9d50-4c15f83f9253-config\") pod \"e5484b60-2312-4287-9d50-4c15f83f9253\" (UID: \"e5484b60-2312-4287-9d50-4c15f83f9253\") " Dec 05 23:41:49 crc kubenswrapper[4734]: I1205 
23:41:49.801745 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5484b60-2312-4287-9d50-4c15f83f9253-dns-swift-storage-0\") pod \"e5484b60-2312-4287-9d50-4c15f83f9253\" (UID: \"e5484b60-2312-4287-9d50-4c15f83f9253\") " Dec 05 23:41:49 crc kubenswrapper[4734]: I1205 23:41:49.801988 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5484b60-2312-4287-9d50-4c15f83f9253-ovsdbserver-sb\") pod \"e5484b60-2312-4287-9d50-4c15f83f9253\" (UID: \"e5484b60-2312-4287-9d50-4c15f83f9253\") " Dec 05 23:41:49 crc kubenswrapper[4734]: I1205 23:41:49.817491 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5484b60-2312-4287-9d50-4c15f83f9253-kube-api-access-57n7l" (OuterVolumeSpecName: "kube-api-access-57n7l") pod "e5484b60-2312-4287-9d50-4c15f83f9253" (UID: "e5484b60-2312-4287-9d50-4c15f83f9253"). InnerVolumeSpecName "kube-api-access-57n7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:41:49 crc kubenswrapper[4734]: I1205 23:41:49.893047 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5484b60-2312-4287-9d50-4c15f83f9253-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e5484b60-2312-4287-9d50-4c15f83f9253" (UID: "e5484b60-2312-4287-9d50-4c15f83f9253"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:41:49 crc kubenswrapper[4734]: I1205 23:41:49.896680 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5484b60-2312-4287-9d50-4c15f83f9253-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e5484b60-2312-4287-9d50-4c15f83f9253" (UID: "e5484b60-2312-4287-9d50-4c15f83f9253"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:41:49 crc kubenswrapper[4734]: I1205 23:41:49.904463 4734 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5484b60-2312-4287-9d50-4c15f83f9253-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:49 crc kubenswrapper[4734]: I1205 23:41:49.904509 4734 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5484b60-2312-4287-9d50-4c15f83f9253-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:49 crc kubenswrapper[4734]: I1205 23:41:49.904541 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57n7l\" (UniqueName: \"kubernetes.io/projected/e5484b60-2312-4287-9d50-4c15f83f9253-kube-api-access-57n7l\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:49 crc kubenswrapper[4734]: I1205 23:41:49.908352 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5484b60-2312-4287-9d50-4c15f83f9253-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e5484b60-2312-4287-9d50-4c15f83f9253" (UID: "e5484b60-2312-4287-9d50-4c15f83f9253"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:41:49 crc kubenswrapper[4734]: I1205 23:41:49.930835 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5484b60-2312-4287-9d50-4c15f83f9253-config" (OuterVolumeSpecName: "config") pod "e5484b60-2312-4287-9d50-4c15f83f9253" (UID: "e5484b60-2312-4287-9d50-4c15f83f9253"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:41:49 crc kubenswrapper[4734]: I1205 23:41:49.935854 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5484b60-2312-4287-9d50-4c15f83f9253-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e5484b60-2312-4287-9d50-4c15f83f9253" (UID: "e5484b60-2312-4287-9d50-4c15f83f9253"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:41:50 crc kubenswrapper[4734]: I1205 23:41:50.006882 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5484b60-2312-4287-9d50-4c15f83f9253-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:50 crc kubenswrapper[4734]: I1205 23:41:50.006923 4734 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5484b60-2312-4287-9d50-4c15f83f9253-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:50 crc kubenswrapper[4734]: I1205 23:41:50.006935 4734 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5484b60-2312-4287-9d50-4c15f83f9253-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:50 crc kubenswrapper[4734]: I1205 23:41:50.146261 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-pg2kh" event={"ID":"e5484b60-2312-4287-9d50-4c15f83f9253","Type":"ContainerDied","Data":"68dde1277a98f29470bf9f7a0af5edf3d0660648981abb8e51eb5b1768df899e"} Dec 05 23:41:50 crc kubenswrapper[4734]: I1205 23:41:50.146321 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-pg2kh" Dec 05 23:41:50 crc kubenswrapper[4734]: I1205 23:41:50.146348 4734 scope.go:117] "RemoveContainer" containerID="ece29f0471793b187008d55962f5263fc66b1c51ef70bbe010b455f83f4f8f38" Dec 05 23:41:50 crc kubenswrapper[4734]: I1205 23:41:50.158611 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"90a03731-2e0d-4698-a55e-0af3ef5372be","Type":"ContainerStarted","Data":"63ac2273fd0e2db2b19b4386d9e54fc33a2dc2eb8b116999089cdc34f6b9dffa"} Dec 05 23:41:50 crc kubenswrapper[4734]: I1205 23:41:50.158681 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"90a03731-2e0d-4698-a55e-0af3ef5372be","Type":"ContainerStarted","Data":"cec143e0cd56c676ddd350d2ea5a73ee2159b485a5cb6873fb161a11fca4097e"} Dec 05 23:41:50 crc kubenswrapper[4734]: I1205 23:41:50.176429 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pc6gs" event={"ID":"8dcbaddd-94e5-4096-832c-a3ea35b141b8","Type":"ContainerStarted","Data":"ed2b1d754974e88c714d0e222f041a774559815332b801c5e97e6f07e8b4b396"} Dec 05 23:41:50 crc kubenswrapper[4734]: I1205 23:41:50.186513 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pc6gs" event={"ID":"8dcbaddd-94e5-4096-832c-a3ea35b141b8","Type":"ContainerStarted","Data":"330a17fe73577617de0dca36b5fb714bc8eab60b0295a5f842b8966dad4f4481"} Dec 05 23:41:50 crc kubenswrapper[4734]: I1205 23:41:50.208134 4734 scope.go:117] "RemoveContainer" containerID="95e07d1707d70b60667d14b890c0a9e0f051ca8c3a5b32a21d9052fc17f66394" Dec 05 23:41:50 crc kubenswrapper[4734]: I1205 23:41:50.231277 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-pg2kh"] Dec 05 23:41:50 crc kubenswrapper[4734]: I1205 23:41:50.240433 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-pg2kh"] Dec 05 
23:41:50 crc kubenswrapper[4734]: I1205 23:41:50.247337 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-pc6gs" podStartSLOduration=2.247311743 podStartE2EDuration="2.247311743s" podCreationTimestamp="2025-12-05 23:41:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:41:50.220535793 +0000 UTC m=+1330.903940069" watchObservedRunningTime="2025-12-05 23:41:50.247311743 +0000 UTC m=+1330.930716029" Dec 05 23:41:50 crc kubenswrapper[4734]: I1205 23:41:50.444899 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:41:50 crc kubenswrapper[4734]: I1205 23:41:50.445009 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:41:51 crc kubenswrapper[4734]: I1205 23:41:51.630888 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5484b60-2312-4287-9d50-4c15f83f9253" path="/var/lib/kubelet/pods/e5484b60-2312-4287-9d50-4c15f83f9253/volumes" Dec 05 23:41:52 crc kubenswrapper[4734]: I1205 23:41:52.208624 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"90a03731-2e0d-4698-a55e-0af3ef5372be","Type":"ContainerStarted","Data":"6b96b9bb9e0f47fe84f2f4aed60e1bfbd5a1559a223a41e49819063ce24c0550"} Dec 05 23:41:52 crc kubenswrapper[4734]: I1205 23:41:52.209194 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ceilometer-0" Dec 05 23:41:52 crc kubenswrapper[4734]: I1205 23:41:52.244331 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.424327876 podStartE2EDuration="6.244304525s" podCreationTimestamp="2025-12-05 23:41:46 +0000 UTC" firstStartedPulling="2025-12-05 23:41:47.253870604 +0000 UTC m=+1327.937274880" lastFinishedPulling="2025-12-05 23:41:51.073847253 +0000 UTC m=+1331.757251529" observedRunningTime="2025-12-05 23:41:52.232802956 +0000 UTC m=+1332.916207232" watchObservedRunningTime="2025-12-05 23:41:52.244304525 +0000 UTC m=+1332.927708801" Dec 05 23:41:56 crc kubenswrapper[4734]: I1205 23:41:56.567678 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 23:41:56 crc kubenswrapper[4734]: I1205 23:41:56.568477 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 23:41:57 crc kubenswrapper[4734]: I1205 23:41:57.279573 4734 generic.go:334] "Generic (PLEG): container finished" podID="8dcbaddd-94e5-4096-832c-a3ea35b141b8" containerID="ed2b1d754974e88c714d0e222f041a774559815332b801c5e97e6f07e8b4b396" exitCode=0 Dec 05 23:41:57 crc kubenswrapper[4734]: I1205 23:41:57.279632 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pc6gs" event={"ID":"8dcbaddd-94e5-4096-832c-a3ea35b141b8","Type":"ContainerDied","Data":"ed2b1d754974e88c714d0e222f041a774559815332b801c5e97e6f07e8b4b396"} Dec 05 23:41:57 crc kubenswrapper[4734]: I1205 23:41:57.577864 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="dd1e86de-adbe-4725-afd3-37e8ef3d0d39" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.198:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 23:41:57 crc kubenswrapper[4734]: I1205 23:41:57.583782 4734 prober.go:107] "Probe 
failed" probeType="Startup" pod="openstack/nova-api-0" podUID="dd1e86de-adbe-4725-afd3-37e8ef3d0d39" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.198:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 23:41:58 crc kubenswrapper[4734]: I1205 23:41:58.735413 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pc6gs" Dec 05 23:41:58 crc kubenswrapper[4734]: I1205 23:41:58.834364 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dcbaddd-94e5-4096-832c-a3ea35b141b8-combined-ca-bundle\") pod \"8dcbaddd-94e5-4096-832c-a3ea35b141b8\" (UID: \"8dcbaddd-94e5-4096-832c-a3ea35b141b8\") " Dec 05 23:41:58 crc kubenswrapper[4734]: I1205 23:41:58.834589 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44m4n\" (UniqueName: \"kubernetes.io/projected/8dcbaddd-94e5-4096-832c-a3ea35b141b8-kube-api-access-44m4n\") pod \"8dcbaddd-94e5-4096-832c-a3ea35b141b8\" (UID: \"8dcbaddd-94e5-4096-832c-a3ea35b141b8\") " Dec 05 23:41:58 crc kubenswrapper[4734]: I1205 23:41:58.834636 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dcbaddd-94e5-4096-832c-a3ea35b141b8-config-data\") pod \"8dcbaddd-94e5-4096-832c-a3ea35b141b8\" (UID: \"8dcbaddd-94e5-4096-832c-a3ea35b141b8\") " Dec 05 23:41:58 crc kubenswrapper[4734]: I1205 23:41:58.834782 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dcbaddd-94e5-4096-832c-a3ea35b141b8-scripts\") pod \"8dcbaddd-94e5-4096-832c-a3ea35b141b8\" (UID: \"8dcbaddd-94e5-4096-832c-a3ea35b141b8\") " Dec 05 23:41:58 crc kubenswrapper[4734]: I1205 23:41:58.842684 4734 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/8dcbaddd-94e5-4096-832c-a3ea35b141b8-scripts" (OuterVolumeSpecName: "scripts") pod "8dcbaddd-94e5-4096-832c-a3ea35b141b8" (UID: "8dcbaddd-94e5-4096-832c-a3ea35b141b8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:41:58 crc kubenswrapper[4734]: I1205 23:41:58.843598 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dcbaddd-94e5-4096-832c-a3ea35b141b8-kube-api-access-44m4n" (OuterVolumeSpecName: "kube-api-access-44m4n") pod "8dcbaddd-94e5-4096-832c-a3ea35b141b8" (UID: "8dcbaddd-94e5-4096-832c-a3ea35b141b8"). InnerVolumeSpecName "kube-api-access-44m4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:41:58 crc kubenswrapper[4734]: I1205 23:41:58.871748 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dcbaddd-94e5-4096-832c-a3ea35b141b8-config-data" (OuterVolumeSpecName: "config-data") pod "8dcbaddd-94e5-4096-832c-a3ea35b141b8" (UID: "8dcbaddd-94e5-4096-832c-a3ea35b141b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:41:58 crc kubenswrapper[4734]: I1205 23:41:58.874587 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dcbaddd-94e5-4096-832c-a3ea35b141b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8dcbaddd-94e5-4096-832c-a3ea35b141b8" (UID: "8dcbaddd-94e5-4096-832c-a3ea35b141b8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:41:58 crc kubenswrapper[4734]: I1205 23:41:58.949878 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dcbaddd-94e5-4096-832c-a3ea35b141b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:58 crc kubenswrapper[4734]: I1205 23:41:58.950110 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44m4n\" (UniqueName: \"kubernetes.io/projected/8dcbaddd-94e5-4096-832c-a3ea35b141b8-kube-api-access-44m4n\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:58 crc kubenswrapper[4734]: I1205 23:41:58.950206 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dcbaddd-94e5-4096-832c-a3ea35b141b8-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:58 crc kubenswrapper[4734]: I1205 23:41:58.950334 4734 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dcbaddd-94e5-4096-832c-a3ea35b141b8-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 23:41:59 crc kubenswrapper[4734]: I1205 23:41:59.303146 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pc6gs" event={"ID":"8dcbaddd-94e5-4096-832c-a3ea35b141b8","Type":"ContainerDied","Data":"330a17fe73577617de0dca36b5fb714bc8eab60b0295a5f842b8966dad4f4481"} Dec 05 23:41:59 crc kubenswrapper[4734]: I1205 23:41:59.303799 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="330a17fe73577617de0dca36b5fb714bc8eab60b0295a5f842b8966dad4f4481" Dec 05 23:41:59 crc kubenswrapper[4734]: I1205 23:41:59.303472 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pc6gs" Dec 05 23:41:59 crc kubenswrapper[4734]: I1205 23:41:59.515292 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 23:41:59 crc kubenswrapper[4734]: I1205 23:41:59.515713 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="adf827fa-f9ac-4c4b-bf90-65a59438d9b6" containerName="nova-scheduler-scheduler" containerID="cri-o://902eb3e8ddd9fa5489a4816e63c558c1da48727d14669e097fef551dabc0b6da" gracePeriod=30 Dec 05 23:41:59 crc kubenswrapper[4734]: I1205 23:41:59.529261 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 23:41:59 crc kubenswrapper[4734]: I1205 23:41:59.529590 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="dd1e86de-adbe-4725-afd3-37e8ef3d0d39" containerName="nova-api-log" containerID="cri-o://c44cfd08e384ecddef7611e8283efb7e2741a145c1725865289789e5265e58e7" gracePeriod=30 Dec 05 23:41:59 crc kubenswrapper[4734]: I1205 23:41:59.529665 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="dd1e86de-adbe-4725-afd3-37e8ef3d0d39" containerName="nova-api-api" containerID="cri-o://1a31348ac28e90362d5f56a88b54b08ebf326cb09cfabfe8ea7a0b1a8e84fd26" gracePeriod=30 Dec 05 23:41:59 crc kubenswrapper[4734]: I1205 23:41:59.563918 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 23:41:59 crc kubenswrapper[4734]: I1205 23:41:59.564314 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1f9ea4e7-3820-41f2-9232-a79f9f1091ac" containerName="nova-metadata-log" containerID="cri-o://162bda318be27c2b91bb6ad6b2d64f0f00ebc952462a8c3be681352cdb7c71e1" gracePeriod=30 Dec 05 23:41:59 crc kubenswrapper[4734]: I1205 23:41:59.564484 4734 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1f9ea4e7-3820-41f2-9232-a79f9f1091ac" containerName="nova-metadata-metadata" containerID="cri-o://018ec382f9db53d69ad6e01565326aa41397b51fb465e9293103e638af38fc58" gracePeriod=30 Dec 05 23:42:00 crc kubenswrapper[4734]: I1205 23:42:00.316303 4734 generic.go:334] "Generic (PLEG): container finished" podID="1f9ea4e7-3820-41f2-9232-a79f9f1091ac" containerID="162bda318be27c2b91bb6ad6b2d64f0f00ebc952462a8c3be681352cdb7c71e1" exitCode=143 Dec 05 23:42:00 crc kubenswrapper[4734]: I1205 23:42:00.316399 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1f9ea4e7-3820-41f2-9232-a79f9f1091ac","Type":"ContainerDied","Data":"162bda318be27c2b91bb6ad6b2d64f0f00ebc952462a8c3be681352cdb7c71e1"} Dec 05 23:42:00 crc kubenswrapper[4734]: I1205 23:42:00.319210 4734 generic.go:334] "Generic (PLEG): container finished" podID="dd1e86de-adbe-4725-afd3-37e8ef3d0d39" containerID="c44cfd08e384ecddef7611e8283efb7e2741a145c1725865289789e5265e58e7" exitCode=143 Dec 05 23:42:00 crc kubenswrapper[4734]: I1205 23:42:00.319254 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dd1e86de-adbe-4725-afd3-37e8ef3d0d39","Type":"ContainerDied","Data":"c44cfd08e384ecddef7611e8283efb7e2741a145c1725865289789e5265e58e7"} Dec 05 23:42:02 crc kubenswrapper[4734]: I1205 23:42:02.697815 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="1f9ea4e7-3820-41f2-9232-a79f9f1091ac" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": read tcp 10.217.0.2:48466->10.217.0.193:8775: read: connection reset by peer" Dec 05 23:42:02 crc kubenswrapper[4734]: I1205 23:42:02.697844 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="1f9ea4e7-3820-41f2-9232-a79f9f1091ac" 
containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": read tcp 10.217.0.2:48468->10.217.0.193:8775: read: connection reset by peer" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.213512 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.351948 4734 generic.go:334] "Generic (PLEG): container finished" podID="1f9ea4e7-3820-41f2-9232-a79f9f1091ac" containerID="018ec382f9db53d69ad6e01565326aa41397b51fb465e9293103e638af38fc58" exitCode=0 Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.352009 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f9ea4e7-3820-41f2-9232-a79f9f1091ac-combined-ca-bundle\") pod \"1f9ea4e7-3820-41f2-9232-a79f9f1091ac\" (UID: \"1f9ea4e7-3820-41f2-9232-a79f9f1091ac\") " Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.352131 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.352659 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f9ea4e7-3820-41f2-9232-a79f9f1091ac-config-data\") pod \"1f9ea4e7-3820-41f2-9232-a79f9f1091ac\" (UID: \"1f9ea4e7-3820-41f2-9232-a79f9f1091ac\") " Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.352707 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f9ea4e7-3820-41f2-9232-a79f9f1091ac-nova-metadata-tls-certs\") pod \"1f9ea4e7-3820-41f2-9232-a79f9f1091ac\" (UID: \"1f9ea4e7-3820-41f2-9232-a79f9f1091ac\") " Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.352163 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1f9ea4e7-3820-41f2-9232-a79f9f1091ac","Type":"ContainerDied","Data":"018ec382f9db53d69ad6e01565326aa41397b51fb465e9293103e638af38fc58"} Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.352782 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1f9ea4e7-3820-41f2-9232-a79f9f1091ac","Type":"ContainerDied","Data":"6793fcd26d871a7d050c131cdc9974367cf22506787cc690c02d52e31cbae85b"} Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.352815 4734 scope.go:117] "RemoveContainer" containerID="018ec382f9db53d69ad6e01565326aa41397b51fb465e9293103e638af38fc58" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.353193 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvv4b\" (UniqueName: \"kubernetes.io/projected/1f9ea4e7-3820-41f2-9232-a79f9f1091ac-kube-api-access-hvv4b\") pod \"1f9ea4e7-3820-41f2-9232-a79f9f1091ac\" (UID: \"1f9ea4e7-3820-41f2-9232-a79f9f1091ac\") " Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.353248 4734 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f9ea4e7-3820-41f2-9232-a79f9f1091ac-logs\") pod \"1f9ea4e7-3820-41f2-9232-a79f9f1091ac\" (UID: \"1f9ea4e7-3820-41f2-9232-a79f9f1091ac\") " Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.354202 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f9ea4e7-3820-41f2-9232-a79f9f1091ac-logs" (OuterVolumeSpecName: "logs") pod "1f9ea4e7-3820-41f2-9232-a79f9f1091ac" (UID: "1f9ea4e7-3820-41f2-9232-a79f9f1091ac"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.359716 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f9ea4e7-3820-41f2-9232-a79f9f1091ac-kube-api-access-hvv4b" (OuterVolumeSpecName: "kube-api-access-hvv4b") pod "1f9ea4e7-3820-41f2-9232-a79f9f1091ac" (UID: "1f9ea4e7-3820-41f2-9232-a79f9f1091ac"). InnerVolumeSpecName "kube-api-access-hvv4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.396698 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f9ea4e7-3820-41f2-9232-a79f9f1091ac-config-data" (OuterVolumeSpecName: "config-data") pod "1f9ea4e7-3820-41f2-9232-a79f9f1091ac" (UID: "1f9ea4e7-3820-41f2-9232-a79f9f1091ac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.399961 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f9ea4e7-3820-41f2-9232-a79f9f1091ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f9ea4e7-3820-41f2-9232-a79f9f1091ac" (UID: "1f9ea4e7-3820-41f2-9232-a79f9f1091ac"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.418480 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f9ea4e7-3820-41f2-9232-a79f9f1091ac-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "1f9ea4e7-3820-41f2-9232-a79f9f1091ac" (UID: "1f9ea4e7-3820-41f2-9232-a79f9f1091ac"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.457203 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f9ea4e7-3820-41f2-9232-a79f9f1091ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.457242 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f9ea4e7-3820-41f2-9232-a79f9f1091ac-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.457252 4734 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f9ea4e7-3820-41f2-9232-a79f9f1091ac-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.457262 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvv4b\" (UniqueName: \"kubernetes.io/projected/1f9ea4e7-3820-41f2-9232-a79f9f1091ac-kube-api-access-hvv4b\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.457272 4734 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f9ea4e7-3820-41f2-9232-a79f9f1091ac-logs\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.559813 4734 scope.go:117] "RemoveContainer" 
containerID="162bda318be27c2b91bb6ad6b2d64f0f00ebc952462a8c3be681352cdb7c71e1" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.588271 4734 scope.go:117] "RemoveContainer" containerID="018ec382f9db53d69ad6e01565326aa41397b51fb465e9293103e638af38fc58" Dec 05 23:42:03 crc kubenswrapper[4734]: E1205 23:42:03.589178 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"018ec382f9db53d69ad6e01565326aa41397b51fb465e9293103e638af38fc58\": container with ID starting with 018ec382f9db53d69ad6e01565326aa41397b51fb465e9293103e638af38fc58 not found: ID does not exist" containerID="018ec382f9db53d69ad6e01565326aa41397b51fb465e9293103e638af38fc58" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.589226 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"018ec382f9db53d69ad6e01565326aa41397b51fb465e9293103e638af38fc58"} err="failed to get container status \"018ec382f9db53d69ad6e01565326aa41397b51fb465e9293103e638af38fc58\": rpc error: code = NotFound desc = could not find container \"018ec382f9db53d69ad6e01565326aa41397b51fb465e9293103e638af38fc58\": container with ID starting with 018ec382f9db53d69ad6e01565326aa41397b51fb465e9293103e638af38fc58 not found: ID does not exist" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.589259 4734 scope.go:117] "RemoveContainer" containerID="162bda318be27c2b91bb6ad6b2d64f0f00ebc952462a8c3be681352cdb7c71e1" Dec 05 23:42:03 crc kubenswrapper[4734]: E1205 23:42:03.590228 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"162bda318be27c2b91bb6ad6b2d64f0f00ebc952462a8c3be681352cdb7c71e1\": container with ID starting with 162bda318be27c2b91bb6ad6b2d64f0f00ebc952462a8c3be681352cdb7c71e1 not found: ID does not exist" containerID="162bda318be27c2b91bb6ad6b2d64f0f00ebc952462a8c3be681352cdb7c71e1" Dec 05 23:42:03 crc 
kubenswrapper[4734]: I1205 23:42:03.590292 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"162bda318be27c2b91bb6ad6b2d64f0f00ebc952462a8c3be681352cdb7c71e1"} err="failed to get container status \"162bda318be27c2b91bb6ad6b2d64f0f00ebc952462a8c3be681352cdb7c71e1\": rpc error: code = NotFound desc = could not find container \"162bda318be27c2b91bb6ad6b2d64f0f00ebc952462a8c3be681352cdb7c71e1\": container with ID starting with 162bda318be27c2b91bb6ad6b2d64f0f00ebc952462a8c3be681352cdb7c71e1 not found: ID does not exist" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.690989 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.715605 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.745844 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 23:42:03 crc kubenswrapper[4734]: E1205 23:42:03.747393 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f9ea4e7-3820-41f2-9232-a79f9f1091ac" containerName="nova-metadata-metadata" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.747430 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f9ea4e7-3820-41f2-9232-a79f9f1091ac" containerName="nova-metadata-metadata" Dec 05 23:42:03 crc kubenswrapper[4734]: E1205 23:42:03.747459 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dcbaddd-94e5-4096-832c-a3ea35b141b8" containerName="nova-manage" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.747470 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dcbaddd-94e5-4096-832c-a3ea35b141b8" containerName="nova-manage" Dec 05 23:42:03 crc kubenswrapper[4734]: E1205 23:42:03.747653 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f9ea4e7-3820-41f2-9232-a79f9f1091ac" 
containerName="nova-metadata-log" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.747666 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f9ea4e7-3820-41f2-9232-a79f9f1091ac" containerName="nova-metadata-log" Dec 05 23:42:03 crc kubenswrapper[4734]: E1205 23:42:03.747701 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5484b60-2312-4287-9d50-4c15f83f9253" containerName="init" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.747710 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5484b60-2312-4287-9d50-4c15f83f9253" containerName="init" Dec 05 23:42:03 crc kubenswrapper[4734]: E1205 23:42:03.747734 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5484b60-2312-4287-9d50-4c15f83f9253" containerName="dnsmasq-dns" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.747744 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5484b60-2312-4287-9d50-4c15f83f9253" containerName="dnsmasq-dns" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.748255 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f9ea4e7-3820-41f2-9232-a79f9f1091ac" containerName="nova-metadata-metadata" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.748316 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f9ea4e7-3820-41f2-9232-a79f9f1091ac" containerName="nova-metadata-log" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.748348 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5484b60-2312-4287-9d50-4c15f83f9253" containerName="dnsmasq-dns" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.748377 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dcbaddd-94e5-4096-832c-a3ea35b141b8" containerName="nova-manage" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.767498 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.772038 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.773963 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.773968 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.869917 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6929b8c5-4cb9-49cd-a084-d578657ce0bf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6929b8c5-4cb9-49cd-a084-d578657ce0bf\") " pod="openstack/nova-metadata-0" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.870049 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6929b8c5-4cb9-49cd-a084-d578657ce0bf-config-data\") pod \"nova-metadata-0\" (UID: \"6929b8c5-4cb9-49cd-a084-d578657ce0bf\") " pod="openstack/nova-metadata-0" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.870141 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6929b8c5-4cb9-49cd-a084-d578657ce0bf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6929b8c5-4cb9-49cd-a084-d578657ce0bf\") " pod="openstack/nova-metadata-0" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.870601 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6929b8c5-4cb9-49cd-a084-d578657ce0bf-logs\") pod \"nova-metadata-0\" (UID: 
\"6929b8c5-4cb9-49cd-a084-d578657ce0bf\") " pod="openstack/nova-metadata-0" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.870756 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd7bg\" (UniqueName: \"kubernetes.io/projected/6929b8c5-4cb9-49cd-a084-d578657ce0bf-kube-api-access-jd7bg\") pod \"nova-metadata-0\" (UID: \"6929b8c5-4cb9-49cd-a084-d578657ce0bf\") " pod="openstack/nova-metadata-0" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.898208 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.972721 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plzcq\" (UniqueName: \"kubernetes.io/projected/adf827fa-f9ac-4c4b-bf90-65a59438d9b6-kube-api-access-plzcq\") pod \"adf827fa-f9ac-4c4b-bf90-65a59438d9b6\" (UID: \"adf827fa-f9ac-4c4b-bf90-65a59438d9b6\") " Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.972881 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adf827fa-f9ac-4c4b-bf90-65a59438d9b6-combined-ca-bundle\") pod \"adf827fa-f9ac-4c4b-bf90-65a59438d9b6\" (UID: \"adf827fa-f9ac-4c4b-bf90-65a59438d9b6\") " Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.973126 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adf827fa-f9ac-4c4b-bf90-65a59438d9b6-config-data\") pod \"adf827fa-f9ac-4c4b-bf90-65a59438d9b6\" (UID: \"adf827fa-f9ac-4c4b-bf90-65a59438d9b6\") " Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.973599 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6929b8c5-4cb9-49cd-a084-d578657ce0bf-logs\") pod \"nova-metadata-0\" (UID: 
\"6929b8c5-4cb9-49cd-a084-d578657ce0bf\") " pod="openstack/nova-metadata-0" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.973708 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd7bg\" (UniqueName: \"kubernetes.io/projected/6929b8c5-4cb9-49cd-a084-d578657ce0bf-kube-api-access-jd7bg\") pod \"nova-metadata-0\" (UID: \"6929b8c5-4cb9-49cd-a084-d578657ce0bf\") " pod="openstack/nova-metadata-0" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.973844 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6929b8c5-4cb9-49cd-a084-d578657ce0bf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6929b8c5-4cb9-49cd-a084-d578657ce0bf\") " pod="openstack/nova-metadata-0" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.973876 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6929b8c5-4cb9-49cd-a084-d578657ce0bf-config-data\") pod \"nova-metadata-0\" (UID: \"6929b8c5-4cb9-49cd-a084-d578657ce0bf\") " pod="openstack/nova-metadata-0" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.973924 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6929b8c5-4cb9-49cd-a084-d578657ce0bf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6929b8c5-4cb9-49cd-a084-d578657ce0bf\") " pod="openstack/nova-metadata-0" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.974078 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6929b8c5-4cb9-49cd-a084-d578657ce0bf-logs\") pod \"nova-metadata-0\" (UID: \"6929b8c5-4cb9-49cd-a084-d578657ce0bf\") " pod="openstack/nova-metadata-0" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.980621 4734 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6929b8c5-4cb9-49cd-a084-d578657ce0bf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6929b8c5-4cb9-49cd-a084-d578657ce0bf\") " pod="openstack/nova-metadata-0" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.982330 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6929b8c5-4cb9-49cd-a084-d578657ce0bf-config-data\") pod \"nova-metadata-0\" (UID: \"6929b8c5-4cb9-49cd-a084-d578657ce0bf\") " pod="openstack/nova-metadata-0" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.982692 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6929b8c5-4cb9-49cd-a084-d578657ce0bf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6929b8c5-4cb9-49cd-a084-d578657ce0bf\") " pod="openstack/nova-metadata-0" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.989403 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adf827fa-f9ac-4c4b-bf90-65a59438d9b6-kube-api-access-plzcq" (OuterVolumeSpecName: "kube-api-access-plzcq") pod "adf827fa-f9ac-4c4b-bf90-65a59438d9b6" (UID: "adf827fa-f9ac-4c4b-bf90-65a59438d9b6"). InnerVolumeSpecName "kube-api-access-plzcq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:42:03 crc kubenswrapper[4734]: I1205 23:42:03.996334 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd7bg\" (UniqueName: \"kubernetes.io/projected/6929b8c5-4cb9-49cd-a084-d578657ce0bf-kube-api-access-jd7bg\") pod \"nova-metadata-0\" (UID: \"6929b8c5-4cb9-49cd-a084-d578657ce0bf\") " pod="openstack/nova-metadata-0" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.011741 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adf827fa-f9ac-4c4b-bf90-65a59438d9b6-config-data" (OuterVolumeSpecName: "config-data") pod "adf827fa-f9ac-4c4b-bf90-65a59438d9b6" (UID: "adf827fa-f9ac-4c4b-bf90-65a59438d9b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.017881 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adf827fa-f9ac-4c4b-bf90-65a59438d9b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "adf827fa-f9ac-4c4b-bf90-65a59438d9b6" (UID: "adf827fa-f9ac-4c4b-bf90-65a59438d9b6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.079280 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adf827fa-f9ac-4c4b-bf90-65a59438d9b6-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.079326 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plzcq\" (UniqueName: \"kubernetes.io/projected/adf827fa-f9ac-4c4b-bf90-65a59438d9b6-kube-api-access-plzcq\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.079340 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adf827fa-f9ac-4c4b-bf90-65a59438d9b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.193682 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 23:42:04 crc kubenswrapper[4734]: E1205 23:42:04.331365 4734 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd1e86de_adbe_4725_afd3_37e8ef3d0d39.slice/crio-conmon-1a31348ac28e90362d5f56a88b54b08ebf326cb09cfabfe8ea7a0b1a8e84fd26.scope\": RecentStats: unable to find data in memory cache]" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.385007 4734 generic.go:334] "Generic (PLEG): container finished" podID="dd1e86de-adbe-4725-afd3-37e8ef3d0d39" containerID="1a31348ac28e90362d5f56a88b54b08ebf326cb09cfabfe8ea7a0b1a8e84fd26" exitCode=0 Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.385071 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"dd1e86de-adbe-4725-afd3-37e8ef3d0d39","Type":"ContainerDied","Data":"1a31348ac28e90362d5f56a88b54b08ebf326cb09cfabfe8ea7a0b1a8e84fd26"} Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.388672 4734 generic.go:334] "Generic (PLEG): container finished" podID="adf827fa-f9ac-4c4b-bf90-65a59438d9b6" containerID="902eb3e8ddd9fa5489a4816e63c558c1da48727d14669e097fef551dabc0b6da" exitCode=0 Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.388770 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"adf827fa-f9ac-4c4b-bf90-65a59438d9b6","Type":"ContainerDied","Data":"902eb3e8ddd9fa5489a4816e63c558c1da48727d14669e097fef551dabc0b6da"} Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.388841 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"adf827fa-f9ac-4c4b-bf90-65a59438d9b6","Type":"ContainerDied","Data":"1dd9f085d37d6ebe5383e22604ab3413f99ab061b563704a3d3c272fb2bfcc18"} Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.388874 4734 scope.go:117] "RemoveContainer" containerID="902eb3e8ddd9fa5489a4816e63c558c1da48727d14669e097fef551dabc0b6da" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.389155 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.423657 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.471705 4734 scope.go:117] "RemoveContainer" containerID="902eb3e8ddd9fa5489a4816e63c558c1da48727d14669e097fef551dabc0b6da" Dec 05 23:42:04 crc kubenswrapper[4734]: E1205 23:42:04.475382 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"902eb3e8ddd9fa5489a4816e63c558c1da48727d14669e097fef551dabc0b6da\": container with ID starting with 902eb3e8ddd9fa5489a4816e63c558c1da48727d14669e097fef551dabc0b6da not found: ID does not exist" containerID="902eb3e8ddd9fa5489a4816e63c558c1da48727d14669e097fef551dabc0b6da" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.475427 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"902eb3e8ddd9fa5489a4816e63c558c1da48727d14669e097fef551dabc0b6da"} err="failed to get container status \"902eb3e8ddd9fa5489a4816e63c558c1da48727d14669e097fef551dabc0b6da\": rpc error: code = NotFound desc = could not find container \"902eb3e8ddd9fa5489a4816e63c558c1da48727d14669e097fef551dabc0b6da\": container with ID starting with 902eb3e8ddd9fa5489a4816e63c558c1da48727d14669e097fef551dabc0b6da not found: ID does not exist" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.483019 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.515746 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.528229 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 23:42:04 crc kubenswrapper[4734]: E1205 23:42:04.528868 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd1e86de-adbe-4725-afd3-37e8ef3d0d39" containerName="nova-api-log" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 
23:42:04.528893 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd1e86de-adbe-4725-afd3-37e8ef3d0d39" containerName="nova-api-log" Dec 05 23:42:04 crc kubenswrapper[4734]: E1205 23:42:04.528917 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf827fa-f9ac-4c4b-bf90-65a59438d9b6" containerName="nova-scheduler-scheduler" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.528923 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf827fa-f9ac-4c4b-bf90-65a59438d9b6" containerName="nova-scheduler-scheduler" Dec 05 23:42:04 crc kubenswrapper[4734]: E1205 23:42:04.528956 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd1e86de-adbe-4725-afd3-37e8ef3d0d39" containerName="nova-api-api" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.528967 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd1e86de-adbe-4725-afd3-37e8ef3d0d39" containerName="nova-api-api" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.529272 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf827fa-f9ac-4c4b-bf90-65a59438d9b6" containerName="nova-scheduler-scheduler" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.529319 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd1e86de-adbe-4725-afd3-37e8ef3d0d39" containerName="nova-api-api" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.529385 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd1e86de-adbe-4725-afd3-37e8ef3d0d39" containerName="nova-api-log" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.530735 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.535069 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.541787 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.592693 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-combined-ca-bundle\") pod \"dd1e86de-adbe-4725-afd3-37e8ef3d0d39\" (UID: \"dd1e86de-adbe-4725-afd3-37e8ef3d0d39\") " Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.592840 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-internal-tls-certs\") pod \"dd1e86de-adbe-4725-afd3-37e8ef3d0d39\" (UID: \"dd1e86de-adbe-4725-afd3-37e8ef3d0d39\") " Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.592872 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcwgw\" (UniqueName: \"kubernetes.io/projected/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-kube-api-access-zcwgw\") pod \"dd1e86de-adbe-4725-afd3-37e8ef3d0d39\" (UID: \"dd1e86de-adbe-4725-afd3-37e8ef3d0d39\") " Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.593057 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-public-tls-certs\") pod \"dd1e86de-adbe-4725-afd3-37e8ef3d0d39\" (UID: \"dd1e86de-adbe-4725-afd3-37e8ef3d0d39\") " Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.593085 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-logs\") pod \"dd1e86de-adbe-4725-afd3-37e8ef3d0d39\" (UID: \"dd1e86de-adbe-4725-afd3-37e8ef3d0d39\") " Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.593113 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-config-data\") pod \"dd1e86de-adbe-4725-afd3-37e8ef3d0d39\" (UID: \"dd1e86de-adbe-4725-afd3-37e8ef3d0d39\") " Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.594875 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-logs" (OuterVolumeSpecName: "logs") pod "dd1e86de-adbe-4725-afd3-37e8ef3d0d39" (UID: "dd1e86de-adbe-4725-afd3-37e8ef3d0d39"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.600910 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-kube-api-access-zcwgw" (OuterVolumeSpecName: "kube-api-access-zcwgw") pod "dd1e86de-adbe-4725-afd3-37e8ef3d0d39" (UID: "dd1e86de-adbe-4725-afd3-37e8ef3d0d39"). InnerVolumeSpecName "kube-api-access-zcwgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.631833 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-config-data" (OuterVolumeSpecName: "config-data") pod "dd1e86de-adbe-4725-afd3-37e8ef3d0d39" (UID: "dd1e86de-adbe-4725-afd3-37e8ef3d0d39"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.634736 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd1e86de-adbe-4725-afd3-37e8ef3d0d39" (UID: "dd1e86de-adbe-4725-afd3-37e8ef3d0d39"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.659331 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "dd1e86de-adbe-4725-afd3-37e8ef3d0d39" (UID: "dd1e86de-adbe-4725-afd3-37e8ef3d0d39"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.663552 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "dd1e86de-adbe-4725-afd3-37e8ef3d0d39" (UID: "dd1e86de-adbe-4725-afd3-37e8ef3d0d39"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.696559 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djng2\" (UniqueName: \"kubernetes.io/projected/77364fbf-3dbe-45c3-adf1-94410f61f0ce-kube-api-access-djng2\") pod \"nova-scheduler-0\" (UID: \"77364fbf-3dbe-45c3-adf1-94410f61f0ce\") " pod="openstack/nova-scheduler-0" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.696649 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77364fbf-3dbe-45c3-adf1-94410f61f0ce-config-data\") pod \"nova-scheduler-0\" (UID: \"77364fbf-3dbe-45c3-adf1-94410f61f0ce\") " pod="openstack/nova-scheduler-0" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.696784 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77364fbf-3dbe-45c3-adf1-94410f61f0ce-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"77364fbf-3dbe-45c3-adf1-94410f61f0ce\") " pod="openstack/nova-scheduler-0" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.696920 4734 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.696933 4734 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-logs\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.696944 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 
23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.696953 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.696963 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcwgw\" (UniqueName: \"kubernetes.io/projected/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-kube-api-access-zcwgw\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.696973 4734 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd1e86de-adbe-4725-afd3-37e8ef3d0d39-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.781754 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 23:42:04 crc kubenswrapper[4734]: W1205 23:42:04.794248 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6929b8c5_4cb9_49cd_a084_d578657ce0bf.slice/crio-4651e5b9b052ec72a58d963d22603a543befb3d7a196ca5cd3e4ed2b69a382d2 WatchSource:0}: Error finding container 4651e5b9b052ec72a58d963d22603a543befb3d7a196ca5cd3e4ed2b69a382d2: Status 404 returned error can't find the container with id 4651e5b9b052ec72a58d963d22603a543befb3d7a196ca5cd3e4ed2b69a382d2 Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.799065 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djng2\" (UniqueName: \"kubernetes.io/projected/77364fbf-3dbe-45c3-adf1-94410f61f0ce-kube-api-access-djng2\") pod \"nova-scheduler-0\" (UID: \"77364fbf-3dbe-45c3-adf1-94410f61f0ce\") " pod="openstack/nova-scheduler-0" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.799175 4734 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77364fbf-3dbe-45c3-adf1-94410f61f0ce-config-data\") pod \"nova-scheduler-0\" (UID: \"77364fbf-3dbe-45c3-adf1-94410f61f0ce\") " pod="openstack/nova-scheduler-0" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.799401 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77364fbf-3dbe-45c3-adf1-94410f61f0ce-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"77364fbf-3dbe-45c3-adf1-94410f61f0ce\") " pod="openstack/nova-scheduler-0" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.804550 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77364fbf-3dbe-45c3-adf1-94410f61f0ce-config-data\") pod \"nova-scheduler-0\" (UID: \"77364fbf-3dbe-45c3-adf1-94410f61f0ce\") " pod="openstack/nova-scheduler-0" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.805357 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77364fbf-3dbe-45c3-adf1-94410f61f0ce-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"77364fbf-3dbe-45c3-adf1-94410f61f0ce\") " pod="openstack/nova-scheduler-0" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.816837 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djng2\" (UniqueName: \"kubernetes.io/projected/77364fbf-3dbe-45c3-adf1-94410f61f0ce-kube-api-access-djng2\") pod \"nova-scheduler-0\" (UID: \"77364fbf-3dbe-45c3-adf1-94410f61f0ce\") " pod="openstack/nova-scheduler-0" Dec 05 23:42:04 crc kubenswrapper[4734]: I1205 23:42:04.854555 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.390067 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.403877 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6929b8c5-4cb9-49cd-a084-d578657ce0bf","Type":"ContainerStarted","Data":"7eee7c2b49624f6d0454ceee999a7f5599e2e06c5586760d9d131f4945ad8358"} Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.403942 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6929b8c5-4cb9-49cd-a084-d578657ce0bf","Type":"ContainerStarted","Data":"5fca4311e4ecf7a71f8c51a2b551d6c8c0eaa8509b8a4750bffcd1f9f891a15e"} Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.403955 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6929b8c5-4cb9-49cd-a084-d578657ce0bf","Type":"ContainerStarted","Data":"4651e5b9b052ec72a58d963d22603a543befb3d7a196ca5cd3e4ed2b69a382d2"} Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.417841 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dd1e86de-adbe-4725-afd3-37e8ef3d0d39","Type":"ContainerDied","Data":"1c52763c1a7485e8391bfcdf4157663864d10ccfeb12cfdd3a9aa8dc2d5e6834"} Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.417899 4734 scope.go:117] "RemoveContainer" containerID="1a31348ac28e90362d5f56a88b54b08ebf326cb09cfabfe8ea7a0b1a8e84fd26" Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.418036 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.447432 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.447404773 podStartE2EDuration="2.447404773s" podCreationTimestamp="2025-12-05 23:42:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:42:05.425980143 +0000 UTC m=+1346.109384419" watchObservedRunningTime="2025-12-05 23:42:05.447404773 +0000 UTC m=+1346.130809049" Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.467255 4734 scope.go:117] "RemoveContainer" containerID="c44cfd08e384ecddef7611e8283efb7e2741a145c1725865289789e5265e58e7" Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.476055 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.498871 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.522337 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.524808 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.528116 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.528234 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.528308 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.530996 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.627698 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f9ea4e7-3820-41f2-9232-a79f9f1091ac" path="/var/lib/kubelet/pods/1f9ea4e7-3820-41f2-9232-a79f9f1091ac/volumes" Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.628476 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adf827fa-f9ac-4c4b-bf90-65a59438d9b6" path="/var/lib/kubelet/pods/adf827fa-f9ac-4c4b-bf90-65a59438d9b6/volumes" Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.629234 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd1e86de-adbe-4725-afd3-37e8ef3d0d39" path="/var/lib/kubelet/pods/dd1e86de-adbe-4725-afd3-37e8ef3d0d39/volumes" Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.638194 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfdwv\" (UniqueName: \"kubernetes.io/projected/fe37850d-71e6-4310-9c74-b98b792cecc4-kube-api-access-zfdwv\") pod \"nova-api-0\" (UID: \"fe37850d-71e6-4310-9c74-b98b792cecc4\") " pod="openstack/nova-api-0" Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.638248 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe37850d-71e6-4310-9c74-b98b792cecc4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fe37850d-71e6-4310-9c74-b98b792cecc4\") " pod="openstack/nova-api-0" Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.638278 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe37850d-71e6-4310-9c74-b98b792cecc4-public-tls-certs\") pod \"nova-api-0\" (UID: \"fe37850d-71e6-4310-9c74-b98b792cecc4\") " pod="openstack/nova-api-0" Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.638317 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe37850d-71e6-4310-9c74-b98b792cecc4-logs\") pod \"nova-api-0\" (UID: \"fe37850d-71e6-4310-9c74-b98b792cecc4\") " pod="openstack/nova-api-0" Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.638380 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe37850d-71e6-4310-9c74-b98b792cecc4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fe37850d-71e6-4310-9c74-b98b792cecc4\") " pod="openstack/nova-api-0" Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.638397 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe37850d-71e6-4310-9c74-b98b792cecc4-config-data\") pod \"nova-api-0\" (UID: \"fe37850d-71e6-4310-9c74-b98b792cecc4\") " pod="openstack/nova-api-0" Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.740421 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfdwv\" (UniqueName: \"kubernetes.io/projected/fe37850d-71e6-4310-9c74-b98b792cecc4-kube-api-access-zfdwv\") pod \"nova-api-0\" (UID: 
\"fe37850d-71e6-4310-9c74-b98b792cecc4\") " pod="openstack/nova-api-0" Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.740512 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe37850d-71e6-4310-9c74-b98b792cecc4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fe37850d-71e6-4310-9c74-b98b792cecc4\") " pod="openstack/nova-api-0" Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.740574 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe37850d-71e6-4310-9c74-b98b792cecc4-public-tls-certs\") pod \"nova-api-0\" (UID: \"fe37850d-71e6-4310-9c74-b98b792cecc4\") " pod="openstack/nova-api-0" Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.740639 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe37850d-71e6-4310-9c74-b98b792cecc4-logs\") pod \"nova-api-0\" (UID: \"fe37850d-71e6-4310-9c74-b98b792cecc4\") " pod="openstack/nova-api-0" Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.740764 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe37850d-71e6-4310-9c74-b98b792cecc4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fe37850d-71e6-4310-9c74-b98b792cecc4\") " pod="openstack/nova-api-0" Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.740791 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe37850d-71e6-4310-9c74-b98b792cecc4-config-data\") pod \"nova-api-0\" (UID: \"fe37850d-71e6-4310-9c74-b98b792cecc4\") " pod="openstack/nova-api-0" Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.741845 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/fe37850d-71e6-4310-9c74-b98b792cecc4-logs\") pod \"nova-api-0\" (UID: \"fe37850d-71e6-4310-9c74-b98b792cecc4\") " pod="openstack/nova-api-0"
Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.746491 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe37850d-71e6-4310-9c74-b98b792cecc4-config-data\") pod \"nova-api-0\" (UID: \"fe37850d-71e6-4310-9c74-b98b792cecc4\") " pod="openstack/nova-api-0"
Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.746458 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe37850d-71e6-4310-9c74-b98b792cecc4-public-tls-certs\") pod \"nova-api-0\" (UID: \"fe37850d-71e6-4310-9c74-b98b792cecc4\") " pod="openstack/nova-api-0"
Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.746726 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe37850d-71e6-4310-9c74-b98b792cecc4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fe37850d-71e6-4310-9c74-b98b792cecc4\") " pod="openstack/nova-api-0"
Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.747243 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe37850d-71e6-4310-9c74-b98b792cecc4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fe37850d-71e6-4310-9c74-b98b792cecc4\") " pod="openstack/nova-api-0"
Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.759750 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfdwv\" (UniqueName: \"kubernetes.io/projected/fe37850d-71e6-4310-9c74-b98b792cecc4-kube-api-access-zfdwv\") pod \"nova-api-0\" (UID: \"fe37850d-71e6-4310-9c74-b98b792cecc4\") " pod="openstack/nova-api-0"
Dec 05 23:42:05 crc kubenswrapper[4734]: I1205 23:42:05.872876 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 05 23:42:06 crc kubenswrapper[4734]: I1205 23:42:06.379021 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 05 23:42:06 crc kubenswrapper[4734]: W1205 23:42:06.381014 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe37850d_71e6_4310_9c74_b98b792cecc4.slice/crio-43b03069594bfa645ace9538082074f4c354e2656ba765a89c84e4e95fe5877a WatchSource:0}: Error finding container 43b03069594bfa645ace9538082074f4c354e2656ba765a89c84e4e95fe5877a: Status 404 returned error can't find the container with id 43b03069594bfa645ace9538082074f4c354e2656ba765a89c84e4e95fe5877a
Dec 05 23:42:06 crc kubenswrapper[4734]: I1205 23:42:06.434791 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fe37850d-71e6-4310-9c74-b98b792cecc4","Type":"ContainerStarted","Data":"43b03069594bfa645ace9538082074f4c354e2656ba765a89c84e4e95fe5877a"}
Dec 05 23:42:06 crc kubenswrapper[4734]: I1205 23:42:06.436187 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"77364fbf-3dbe-45c3-adf1-94410f61f0ce","Type":"ContainerStarted","Data":"4dc9fe76caed286532f9e1902a206f9f53a17c28d49e8ca938733d2b0e256e5b"}
Dec 05 23:42:06 crc kubenswrapper[4734]: I1205 23:42:06.436220 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"77364fbf-3dbe-45c3-adf1-94410f61f0ce","Type":"ContainerStarted","Data":"80b1554cb6b2b527352365dc3ea48b7b3e6fb0ed8c51d6e5c1bf1e45d58e40d7"}
Dec 05 23:42:06 crc kubenswrapper[4734]: I1205 23:42:06.463051 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.463011775 podStartE2EDuration="2.463011775s" podCreationTimestamp="2025-12-05 23:42:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:42:06.454587171 +0000 UTC m=+1347.137991447" watchObservedRunningTime="2025-12-05 23:42:06.463011775 +0000 UTC m=+1347.146416051"
Dec 05 23:42:07 crc kubenswrapper[4734]: I1205 23:42:07.459604 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fe37850d-71e6-4310-9c74-b98b792cecc4","Type":"ContainerStarted","Data":"ea2fde9de2a44519a3bcff56a0096699e91fb572f30c9303eac3ef853c93be5e"}
Dec 05 23:42:07 crc kubenswrapper[4734]: I1205 23:42:07.459700 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fe37850d-71e6-4310-9c74-b98b792cecc4","Type":"ContainerStarted","Data":"a5ed4b1dd0228a8e87a386a0460d966ec7585473b072293c328fd16ccbe70227"}
Dec 05 23:42:07 crc kubenswrapper[4734]: I1205 23:42:07.493789 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.493762904 podStartE2EDuration="2.493762904s" podCreationTimestamp="2025-12-05 23:42:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:42:07.484979421 +0000 UTC m=+1348.168383707" watchObservedRunningTime="2025-12-05 23:42:07.493762904 +0000 UTC m=+1348.177167180"
Dec 05 23:42:09 crc kubenswrapper[4734]: I1205 23:42:09.194647 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 05 23:42:09 crc kubenswrapper[4734]: I1205 23:42:09.196059 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 05 23:42:09 crc kubenswrapper[4734]: I1205 23:42:09.855222 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Dec 05 23:42:14 crc kubenswrapper[4734]: I1205 23:42:14.194156 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Dec 05 23:42:14 crc kubenswrapper[4734]: I1205 23:42:14.195133 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Dec 05 23:42:14 crc kubenswrapper[4734]: I1205 23:42:14.855439 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Dec 05 23:42:14 crc kubenswrapper[4734]: I1205 23:42:14.888016 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Dec 05 23:42:15 crc kubenswrapper[4734]: I1205 23:42:15.210789 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6929b8c5-4cb9-49cd-a084-d578657ce0bf" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 05 23:42:15 crc kubenswrapper[4734]: I1205 23:42:15.210841 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6929b8c5-4cb9-49cd-a084-d578657ce0bf" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 05 23:42:15 crc kubenswrapper[4734]: I1205 23:42:15.579586 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Dec 05 23:42:15 crc kubenswrapper[4734]: I1205 23:42:15.873486 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 05 23:42:15 crc kubenswrapper[4734]: I1205 23:42:15.873609 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 05 23:42:16 crc kubenswrapper[4734]: I1205 23:42:16.623080 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Dec 05 23:42:16 crc kubenswrapper[4734]: I1205 23:42:16.889765 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="fe37850d-71e6-4310-9c74-b98b792cecc4" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 05 23:42:16 crc kubenswrapper[4734]: I1205 23:42:16.889756 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="fe37850d-71e6-4310-9c74-b98b792cecc4" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 05 23:42:20 crc kubenswrapper[4734]: I1205 23:42:20.445607 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 23:42:20 crc kubenswrapper[4734]: I1205 23:42:20.446660 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 23:42:24 crc kubenswrapper[4734]: I1205 23:42:24.204975 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Dec 05 23:42:24 crc kubenswrapper[4734]: I1205 23:42:24.209133 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Dec 05 23:42:24 crc kubenswrapper[4734]: I1205 23:42:24.214666 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Dec 05 23:42:24 crc kubenswrapper[4734]: I1205 23:42:24.409797 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7tc9d"]
Dec 05 23:42:24 crc kubenswrapper[4734]: I1205 23:42:24.413718 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7tc9d"
Dec 05 23:42:24 crc kubenswrapper[4734]: I1205 23:42:24.449320 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7tc9d"]
Dec 05 23:42:24 crc kubenswrapper[4734]: I1205 23:42:24.608632 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txck6\" (UniqueName: \"kubernetes.io/projected/1897868f-e014-4ddd-b906-6240a4e00ec3-kube-api-access-txck6\") pod \"redhat-operators-7tc9d\" (UID: \"1897868f-e014-4ddd-b906-6240a4e00ec3\") " pod="openshift-marketplace/redhat-operators-7tc9d"
Dec 05 23:42:24 crc kubenswrapper[4734]: I1205 23:42:24.608689 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1897868f-e014-4ddd-b906-6240a4e00ec3-utilities\") pod \"redhat-operators-7tc9d\" (UID: \"1897868f-e014-4ddd-b906-6240a4e00ec3\") " pod="openshift-marketplace/redhat-operators-7tc9d"
Dec 05 23:42:24 crc kubenswrapper[4734]: I1205 23:42:24.608765 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1897868f-e014-4ddd-b906-6240a4e00ec3-catalog-content\") pod \"redhat-operators-7tc9d\" (UID: \"1897868f-e014-4ddd-b906-6240a4e00ec3\") " pod="openshift-marketplace/redhat-operators-7tc9d"
Dec 05 23:42:24 crc kubenswrapper[4734]: I1205 23:42:24.653201 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Dec 05 23:42:24 crc kubenswrapper[4734]: I1205 23:42:24.711287 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txck6\" (UniqueName: \"kubernetes.io/projected/1897868f-e014-4ddd-b906-6240a4e00ec3-kube-api-access-txck6\") pod \"redhat-operators-7tc9d\" (UID: \"1897868f-e014-4ddd-b906-6240a4e00ec3\") " pod="openshift-marketplace/redhat-operators-7tc9d"
Dec 05 23:42:24 crc kubenswrapper[4734]: I1205 23:42:24.711344 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1897868f-e014-4ddd-b906-6240a4e00ec3-utilities\") pod \"redhat-operators-7tc9d\" (UID: \"1897868f-e014-4ddd-b906-6240a4e00ec3\") " pod="openshift-marketplace/redhat-operators-7tc9d"
Dec 05 23:42:24 crc kubenswrapper[4734]: I1205 23:42:24.711418 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1897868f-e014-4ddd-b906-6240a4e00ec3-catalog-content\") pod \"redhat-operators-7tc9d\" (UID: \"1897868f-e014-4ddd-b906-6240a4e00ec3\") " pod="openshift-marketplace/redhat-operators-7tc9d"
Dec 05 23:42:24 crc kubenswrapper[4734]: I1205 23:42:24.712020 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1897868f-e014-4ddd-b906-6240a4e00ec3-utilities\") pod \"redhat-operators-7tc9d\" (UID: \"1897868f-e014-4ddd-b906-6240a4e00ec3\") " pod="openshift-marketplace/redhat-operators-7tc9d"
Dec 05 23:42:24 crc kubenswrapper[4734]: I1205 23:42:24.712138 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1897868f-e014-4ddd-b906-6240a4e00ec3-catalog-content\") pod \"redhat-operators-7tc9d\" (UID: \"1897868f-e014-4ddd-b906-6240a4e00ec3\") " pod="openshift-marketplace/redhat-operators-7tc9d"
Dec 05 23:42:24 crc kubenswrapper[4734]: I1205 23:42:24.752757 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txck6\" (UniqueName: \"kubernetes.io/projected/1897868f-e014-4ddd-b906-6240a4e00ec3-kube-api-access-txck6\") pod \"redhat-operators-7tc9d\" (UID: \"1897868f-e014-4ddd-b906-6240a4e00ec3\") " pod="openshift-marketplace/redhat-operators-7tc9d"
Dec 05 23:42:24 crc kubenswrapper[4734]: I1205 23:42:24.753675 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7tc9d"
Dec 05 23:42:25 crc kubenswrapper[4734]: I1205 23:42:25.350868 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7tc9d"]
Dec 05 23:42:25 crc kubenswrapper[4734]: W1205 23:42:25.354175 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1897868f_e014_4ddd_b906_6240a4e00ec3.slice/crio-1a52aa99d6f61f06fbc042462468c5799409c096a03d3ed6b116f6b2ea93ca86 WatchSource:0}: Error finding container 1a52aa99d6f61f06fbc042462468c5799409c096a03d3ed6b116f6b2ea93ca86: Status 404 returned error can't find the container with id 1a52aa99d6f61f06fbc042462468c5799409c096a03d3ed6b116f6b2ea93ca86
Dec 05 23:42:25 crc kubenswrapper[4734]: I1205 23:42:25.674626 4734 generic.go:334] "Generic (PLEG): container finished" podID="1897868f-e014-4ddd-b906-6240a4e00ec3" containerID="8448122e50558e714451fb59ffa00c32fcad0d596b7a60f733f3989457bb19c0" exitCode=0
Dec 05 23:42:25 crc kubenswrapper[4734]: I1205 23:42:25.674687 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tc9d" event={"ID":"1897868f-e014-4ddd-b906-6240a4e00ec3","Type":"ContainerDied","Data":"8448122e50558e714451fb59ffa00c32fcad0d596b7a60f733f3989457bb19c0"}
Dec 05 23:42:25 crc kubenswrapper[4734]: I1205 23:42:25.675391 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tc9d" event={"ID":"1897868f-e014-4ddd-b906-6240a4e00ec3","Type":"ContainerStarted","Data":"1a52aa99d6f61f06fbc042462468c5799409c096a03d3ed6b116f6b2ea93ca86"}
Dec 05 23:42:25 crc kubenswrapper[4734]: I1205 23:42:25.883951 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Dec 05 23:42:25 crc kubenswrapper[4734]: I1205 23:42:25.884824 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Dec 05 23:42:25 crc kubenswrapper[4734]: I1205 23:42:25.887141 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Dec 05 23:42:25 crc kubenswrapper[4734]: I1205 23:42:25.891384 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Dec 05 23:42:26 crc kubenswrapper[4734]: I1205 23:42:26.705101 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tc9d" event={"ID":"1897868f-e014-4ddd-b906-6240a4e00ec3","Type":"ContainerStarted","Data":"0ee9fb36c6fb2261b50c5550aa1799efacc6c315c6f96c821ad8dff8fa2febd6"}
Dec 05 23:42:26 crc kubenswrapper[4734]: I1205 23:42:26.706156 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Dec 05 23:42:26 crc kubenswrapper[4734]: I1205 23:42:26.717039 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Dec 05 23:42:28 crc kubenswrapper[4734]: I1205 23:42:28.727252 4734 generic.go:334] "Generic (PLEG): container finished" podID="1897868f-e014-4ddd-b906-6240a4e00ec3" containerID="0ee9fb36c6fb2261b50c5550aa1799efacc6c315c6f96c821ad8dff8fa2febd6" exitCode=0
Dec 05 23:42:28 crc kubenswrapper[4734]: I1205 23:42:28.727306 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tc9d" event={"ID":"1897868f-e014-4ddd-b906-6240a4e00ec3","Type":"ContainerDied","Data":"0ee9fb36c6fb2261b50c5550aa1799efacc6c315c6f96c821ad8dff8fa2febd6"}
Dec 05 23:42:30 crc kubenswrapper[4734]: I1205 23:42:30.751589 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tc9d" event={"ID":"1897868f-e014-4ddd-b906-6240a4e00ec3","Type":"ContainerStarted","Data":"e8671952cce05404914c45d11356827fc54cb58452bd1f4dfccf0e84b4d92333"}
Dec 05 23:42:30 crc kubenswrapper[4734]: I1205 23:42:30.781035 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7tc9d" podStartSLOduration=3.193853513 podStartE2EDuration="6.781000484s" podCreationTimestamp="2025-12-05 23:42:24 +0000 UTC" firstStartedPulling="2025-12-05 23:42:25.679760091 +0000 UTC m=+1366.363164367" lastFinishedPulling="2025-12-05 23:42:29.266907062 +0000 UTC m=+1369.950311338" observedRunningTime="2025-12-05 23:42:30.773307877 +0000 UTC m=+1371.456712173" watchObservedRunningTime="2025-12-05 23:42:30.781000484 +0000 UTC m=+1371.464404760"
Dec 05 23:42:34 crc kubenswrapper[4734]: I1205 23:42:34.754866 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7tc9d"
Dec 05 23:42:34 crc kubenswrapper[4734]: I1205 23:42:34.755878 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7tc9d"
Dec 05 23:42:35 crc kubenswrapper[4734]: I1205 23:42:35.155040 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 05 23:42:35 crc kubenswrapper[4734]: I1205 23:42:35.824719 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7tc9d" podUID="1897868f-e014-4ddd-b906-6240a4e00ec3" containerName="registry-server" probeResult="failure" output=<
Dec 05 23:42:35 crc kubenswrapper[4734]: timeout: failed to connect service ":50051" within 1s
Dec 05 23:42:35 crc kubenswrapper[4734]: >
Dec 05 23:42:36 crc kubenswrapper[4734]: I1205 23:42:36.555265 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 05 23:42:40 crc kubenswrapper[4734]: I1205 23:42:40.366258 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="c35eaa12-d993-4769-975b-35a5ac6609e0" containerName="rabbitmq" containerID="cri-o://05f9dd5d16e65b3b556d3d6b976b02a62a361e9e97c167b7525d95c4d23713ac" gracePeriod=604795
Dec 05 23:42:41 crc kubenswrapper[4734]: I1205 23:42:41.085312 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="c35eaa12-d993-4769-975b-35a5ac6609e0" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.98:5671: connect: connection refused"
Dec 05 23:42:41 crc kubenswrapper[4734]: I1205 23:42:41.342897 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="ed95027c-1ded-4127-a341-7ee81018d4b6" containerName="rabbitmq" containerID="cri-o://e568c4fef629437d46890910627d0796cf4be63170bfe6ba198f3522a7208650" gracePeriod=604796
Dec 05 23:42:41 crc kubenswrapper[4734]: I1205 23:42:41.448421 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="ed95027c-1ded-4127-a341-7ee81018d4b6" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused"
Dec 05 23:42:44 crc kubenswrapper[4734]: I1205 23:42:44.844314 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7tc9d"
Dec 05 23:42:44 crc kubenswrapper[4734]: I1205 23:42:44.984705 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7tc9d"
Dec 05 23:42:45 crc kubenswrapper[4734]: I1205 23:42:45.121474 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7tc9d"]
Dec 05 23:42:45 crc kubenswrapper[4734]: I1205 23:42:45.988056 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7tc9d" podUID="1897868f-e014-4ddd-b906-6240a4e00ec3" containerName="registry-server" containerID="cri-o://e8671952cce05404914c45d11356827fc54cb58452bd1f4dfccf0e84b4d92333" gracePeriod=2
Dec 05 23:42:46 crc kubenswrapper[4734]: I1205 23:42:46.476922 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7tc9d"
Dec 05 23:42:46 crc kubenswrapper[4734]: I1205 23:42:46.564537 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1897868f-e014-4ddd-b906-6240a4e00ec3-utilities\") pod \"1897868f-e014-4ddd-b906-6240a4e00ec3\" (UID: \"1897868f-e014-4ddd-b906-6240a4e00ec3\") "
Dec 05 23:42:46 crc kubenswrapper[4734]: I1205 23:42:46.564685 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txck6\" (UniqueName: \"kubernetes.io/projected/1897868f-e014-4ddd-b906-6240a4e00ec3-kube-api-access-txck6\") pod \"1897868f-e014-4ddd-b906-6240a4e00ec3\" (UID: \"1897868f-e014-4ddd-b906-6240a4e00ec3\") "
Dec 05 23:42:46 crc kubenswrapper[4734]: I1205 23:42:46.564896 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1897868f-e014-4ddd-b906-6240a4e00ec3-catalog-content\") pod \"1897868f-e014-4ddd-b906-6240a4e00ec3\" (UID: \"1897868f-e014-4ddd-b906-6240a4e00ec3\") "
Dec 05 23:42:46 crc kubenswrapper[4734]: I1205 23:42:46.565342 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1897868f-e014-4ddd-b906-6240a4e00ec3-utilities" (OuterVolumeSpecName: "utilities") pod "1897868f-e014-4ddd-b906-6240a4e00ec3" (UID: "1897868f-e014-4ddd-b906-6240a4e00ec3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 23:42:46 crc kubenswrapper[4734]: I1205 23:42:46.575789 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1897868f-e014-4ddd-b906-6240a4e00ec3-kube-api-access-txck6" (OuterVolumeSpecName: "kube-api-access-txck6") pod "1897868f-e014-4ddd-b906-6240a4e00ec3" (UID: "1897868f-e014-4ddd-b906-6240a4e00ec3"). InnerVolumeSpecName "kube-api-access-txck6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 23:42:46 crc kubenswrapper[4734]: I1205 23:42:46.670935 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1897868f-e014-4ddd-b906-6240a4e00ec3-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 23:42:46 crc kubenswrapper[4734]: I1205 23:42:46.670991 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txck6\" (UniqueName: \"kubernetes.io/projected/1897868f-e014-4ddd-b906-6240a4e00ec3-kube-api-access-txck6\") on node \"crc\" DevicePath \"\""
Dec 05 23:42:46 crc kubenswrapper[4734]: I1205 23:42:46.706326 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1897868f-e014-4ddd-b906-6240a4e00ec3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1897868f-e014-4ddd-b906-6240a4e00ec3" (UID: "1897868f-e014-4ddd-b906-6240a4e00ec3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 23:42:46 crc kubenswrapper[4734]: I1205 23:42:46.774514 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1897868f-e014-4ddd-b906-6240a4e00ec3-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 23:42:46 crc kubenswrapper[4734]: I1205 23:42:46.990321 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.009278 4734 generic.go:334] "Generic (PLEG): container finished" podID="1897868f-e014-4ddd-b906-6240a4e00ec3" containerID="e8671952cce05404914c45d11356827fc54cb58452bd1f4dfccf0e84b4d92333" exitCode=0
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.009374 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tc9d" event={"ID":"1897868f-e014-4ddd-b906-6240a4e00ec3","Type":"ContainerDied","Data":"e8671952cce05404914c45d11356827fc54cb58452bd1f4dfccf0e84b4d92333"}
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.009419 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7tc9d"
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.009469 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tc9d" event={"ID":"1897868f-e014-4ddd-b906-6240a4e00ec3","Type":"ContainerDied","Data":"1a52aa99d6f61f06fbc042462468c5799409c096a03d3ed6b116f6b2ea93ca86"}
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.009498 4734 scope.go:117] "RemoveContainer" containerID="e8671952cce05404914c45d11356827fc54cb58452bd1f4dfccf0e84b4d92333"
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.012963 4734 generic.go:334] "Generic (PLEG): container finished" podID="c35eaa12-d993-4769-975b-35a5ac6609e0" containerID="05f9dd5d16e65b3b556d3d6b976b02a62a361e9e97c167b7525d95c4d23713ac" exitCode=0
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.013007 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c35eaa12-d993-4769-975b-35a5ac6609e0","Type":"ContainerDied","Data":"05f9dd5d16e65b3b556d3d6b976b02a62a361e9e97c167b7525d95c4d23713ac"}
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.013045 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c35eaa12-d993-4769-975b-35a5ac6609e0","Type":"ContainerDied","Data":"5463b83dd1e6a729a45f7c14d4b26f8d563230a8dbeb04f3da73d157f40ffe7c"}
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.013133 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.053012 4734 scope.go:117] "RemoveContainer" containerID="0ee9fb36c6fb2261b50c5550aa1799efacc6c315c6f96c821ad8dff8fa2febd6"
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.080082 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c35eaa12-d993-4769-975b-35a5ac6609e0-pod-info\") pod \"c35eaa12-d993-4769-975b-35a5ac6609e0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") "
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.080166 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c35eaa12-d993-4769-975b-35a5ac6609e0-server-conf\") pod \"c35eaa12-d993-4769-975b-35a5ac6609e0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") "
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.080186 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c35eaa12-d993-4769-975b-35a5ac6609e0-config-data\") pod \"c35eaa12-d993-4769-975b-35a5ac6609e0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") "
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.080299 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"c35eaa12-d993-4769-975b-35a5ac6609e0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") "
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.080340 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c35eaa12-d993-4769-975b-35a5ac6609e0-rabbitmq-plugins\") pod \"c35eaa12-d993-4769-975b-35a5ac6609e0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") "
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.080376 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c35eaa12-d993-4769-975b-35a5ac6609e0-rabbitmq-confd\") pod \"c35eaa12-d993-4769-975b-35a5ac6609e0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") "
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.080414 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c35eaa12-d993-4769-975b-35a5ac6609e0-plugins-conf\") pod \"c35eaa12-d993-4769-975b-35a5ac6609e0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") "
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.080451 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvnnn\" (UniqueName: \"kubernetes.io/projected/c35eaa12-d993-4769-975b-35a5ac6609e0-kube-api-access-rvnnn\") pod \"c35eaa12-d993-4769-975b-35a5ac6609e0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") "
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.080517 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c35eaa12-d993-4769-975b-35a5ac6609e0-rabbitmq-tls\") pod \"c35eaa12-d993-4769-975b-35a5ac6609e0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") "
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.080619 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c35eaa12-d993-4769-975b-35a5ac6609e0-erlang-cookie-secret\") pod \"c35eaa12-d993-4769-975b-35a5ac6609e0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") "
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.080706 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c35eaa12-d993-4769-975b-35a5ac6609e0-rabbitmq-erlang-cookie\") pod \"c35eaa12-d993-4769-975b-35a5ac6609e0\" (UID: \"c35eaa12-d993-4769-975b-35a5ac6609e0\") "
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.082040 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c35eaa12-d993-4769-975b-35a5ac6609e0-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c35eaa12-d993-4769-975b-35a5ac6609e0" (UID: "c35eaa12-d993-4769-975b-35a5ac6609e0"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.082706 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c35eaa12-d993-4769-975b-35a5ac6609e0-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c35eaa12-d993-4769-975b-35a5ac6609e0" (UID: "c35eaa12-d993-4769-975b-35a5ac6609e0"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.090341 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c35eaa12-d993-4769-975b-35a5ac6609e0-kube-api-access-rvnnn" (OuterVolumeSpecName: "kube-api-access-rvnnn") pod "c35eaa12-d993-4769-975b-35a5ac6609e0" (UID: "c35eaa12-d993-4769-975b-35a5ac6609e0"). InnerVolumeSpecName "kube-api-access-rvnnn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.096397 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c35eaa12-d993-4769-975b-35a5ac6609e0-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c35eaa12-d993-4769-975b-35a5ac6609e0" (UID: "c35eaa12-d993-4769-975b-35a5ac6609e0"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.097757 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c35eaa12-d993-4769-975b-35a5ac6609e0-pod-info" (OuterVolumeSpecName: "pod-info") pod "c35eaa12-d993-4769-975b-35a5ac6609e0" (UID: "c35eaa12-d993-4769-975b-35a5ac6609e0"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.107303 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "persistence") pod "c35eaa12-d993-4769-975b-35a5ac6609e0" (UID: "c35eaa12-d993-4769-975b-35a5ac6609e0"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.106603 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c35eaa12-d993-4769-975b-35a5ac6609e0-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c35eaa12-d993-4769-975b-35a5ac6609e0" (UID: "c35eaa12-d993-4769-975b-35a5ac6609e0"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.108305 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c35eaa12-d993-4769-975b-35a5ac6609e0-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "c35eaa12-d993-4769-975b-35a5ac6609e0" (UID: "c35eaa12-d993-4769-975b-35a5ac6609e0"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.118008 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7tc9d"]
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.136144 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7tc9d"]
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.161277 4734 scope.go:117] "RemoveContainer" containerID="8448122e50558e714451fb59ffa00c32fcad0d596b7a60f733f3989457bb19c0"
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.188073 4734 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c35eaa12-d993-4769-975b-35a5ac6609e0-pod-info\") on node \"crc\" DevicePath \"\""
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.188153 4734 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" "
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.188167 4734 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c35eaa12-d993-4769-975b-35a5ac6609e0-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.189078 4734 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c35eaa12-d993-4769-975b-35a5ac6609e0-plugins-conf\") on node \"crc\" DevicePath \"\""
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.189100 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvnnn\" (UniqueName: \"kubernetes.io/projected/c35eaa12-d993-4769-975b-35a5ac6609e0-kube-api-access-rvnnn\") on node \"crc\" DevicePath \"\""
Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.189132 4734 reconciler_common.go:293]
"Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c35eaa12-d993-4769-975b-35a5ac6609e0-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.189144 4734 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c35eaa12-d993-4769-975b-35a5ac6609e0-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.189154 4734 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c35eaa12-d993-4769-975b-35a5ac6609e0-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.208995 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c35eaa12-d993-4769-975b-35a5ac6609e0-server-conf" (OuterVolumeSpecName: "server-conf") pod "c35eaa12-d993-4769-975b-35a5ac6609e0" (UID: "c35eaa12-d993-4769-975b-35a5ac6609e0"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.219921 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c35eaa12-d993-4769-975b-35a5ac6609e0-config-data" (OuterVolumeSpecName: "config-data") pod "c35eaa12-d993-4769-975b-35a5ac6609e0" (UID: "c35eaa12-d993-4769-975b-35a5ac6609e0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.226359 4734 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.292253 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c35eaa12-d993-4769-975b-35a5ac6609e0-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.292301 4734 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.292313 4734 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c35eaa12-d993-4769-975b-35a5ac6609e0-server-conf\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.327787 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c35eaa12-d993-4769-975b-35a5ac6609e0-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c35eaa12-d993-4769-975b-35a5ac6609e0" (UID: "c35eaa12-d993-4769-975b-35a5ac6609e0"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.394202 4734 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c35eaa12-d993-4769-975b-35a5ac6609e0-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.433407 4734 scope.go:117] "RemoveContainer" containerID="e8671952cce05404914c45d11356827fc54cb58452bd1f4dfccf0e84b4d92333" Dec 05 23:42:47 crc kubenswrapper[4734]: E1205 23:42:47.434184 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8671952cce05404914c45d11356827fc54cb58452bd1f4dfccf0e84b4d92333\": container with ID starting with e8671952cce05404914c45d11356827fc54cb58452bd1f4dfccf0e84b4d92333 not found: ID does not exist" containerID="e8671952cce05404914c45d11356827fc54cb58452bd1f4dfccf0e84b4d92333" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.434238 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8671952cce05404914c45d11356827fc54cb58452bd1f4dfccf0e84b4d92333"} err="failed to get container status \"e8671952cce05404914c45d11356827fc54cb58452bd1f4dfccf0e84b4d92333\": rpc error: code = NotFound desc = could not find container \"e8671952cce05404914c45d11356827fc54cb58452bd1f4dfccf0e84b4d92333\": container with ID starting with e8671952cce05404914c45d11356827fc54cb58452bd1f4dfccf0e84b4d92333 not found: ID does not exist" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.434268 4734 scope.go:117] "RemoveContainer" containerID="0ee9fb36c6fb2261b50c5550aa1799efacc6c315c6f96c821ad8dff8fa2febd6" Dec 05 23:42:47 crc kubenswrapper[4734]: E1205 23:42:47.434830 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ee9fb36c6fb2261b50c5550aa1799efacc6c315c6f96c821ad8dff8fa2febd6\": 
container with ID starting with 0ee9fb36c6fb2261b50c5550aa1799efacc6c315c6f96c821ad8dff8fa2febd6 not found: ID does not exist" containerID="0ee9fb36c6fb2261b50c5550aa1799efacc6c315c6f96c821ad8dff8fa2febd6" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.434905 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ee9fb36c6fb2261b50c5550aa1799efacc6c315c6f96c821ad8dff8fa2febd6"} err="failed to get container status \"0ee9fb36c6fb2261b50c5550aa1799efacc6c315c6f96c821ad8dff8fa2febd6\": rpc error: code = NotFound desc = could not find container \"0ee9fb36c6fb2261b50c5550aa1799efacc6c315c6f96c821ad8dff8fa2febd6\": container with ID starting with 0ee9fb36c6fb2261b50c5550aa1799efacc6c315c6f96c821ad8dff8fa2febd6 not found: ID does not exist" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.434942 4734 scope.go:117] "RemoveContainer" containerID="8448122e50558e714451fb59ffa00c32fcad0d596b7a60f733f3989457bb19c0" Dec 05 23:42:47 crc kubenswrapper[4734]: E1205 23:42:47.436266 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8448122e50558e714451fb59ffa00c32fcad0d596b7a60f733f3989457bb19c0\": container with ID starting with 8448122e50558e714451fb59ffa00c32fcad0d596b7a60f733f3989457bb19c0 not found: ID does not exist" containerID="8448122e50558e714451fb59ffa00c32fcad0d596b7a60f733f3989457bb19c0" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.436325 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8448122e50558e714451fb59ffa00c32fcad0d596b7a60f733f3989457bb19c0"} err="failed to get container status \"8448122e50558e714451fb59ffa00c32fcad0d596b7a60f733f3989457bb19c0\": rpc error: code = NotFound desc = could not find container \"8448122e50558e714451fb59ffa00c32fcad0d596b7a60f733f3989457bb19c0\": container with ID starting with 
8448122e50558e714451fb59ffa00c32fcad0d596b7a60f733f3989457bb19c0 not found: ID does not exist" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.436343 4734 scope.go:117] "RemoveContainer" containerID="05f9dd5d16e65b3b556d3d6b976b02a62a361e9e97c167b7525d95c4d23713ac" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.468610 4734 scope.go:117] "RemoveContainer" containerID="c5950e0647bc09668aa5bb962f9d7316a20f3b0c1e9e76366f98c21ef1804c9e" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.499849 4734 scope.go:117] "RemoveContainer" containerID="05f9dd5d16e65b3b556d3d6b976b02a62a361e9e97c167b7525d95c4d23713ac" Dec 05 23:42:47 crc kubenswrapper[4734]: E1205 23:42:47.500509 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05f9dd5d16e65b3b556d3d6b976b02a62a361e9e97c167b7525d95c4d23713ac\": container with ID starting with 05f9dd5d16e65b3b556d3d6b976b02a62a361e9e97c167b7525d95c4d23713ac not found: ID does not exist" containerID="05f9dd5d16e65b3b556d3d6b976b02a62a361e9e97c167b7525d95c4d23713ac" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.500565 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05f9dd5d16e65b3b556d3d6b976b02a62a361e9e97c167b7525d95c4d23713ac"} err="failed to get container status \"05f9dd5d16e65b3b556d3d6b976b02a62a361e9e97c167b7525d95c4d23713ac\": rpc error: code = NotFound desc = could not find container \"05f9dd5d16e65b3b556d3d6b976b02a62a361e9e97c167b7525d95c4d23713ac\": container with ID starting with 05f9dd5d16e65b3b556d3d6b976b02a62a361e9e97c167b7525d95c4d23713ac not found: ID does not exist" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.500594 4734 scope.go:117] "RemoveContainer" containerID="c5950e0647bc09668aa5bb962f9d7316a20f3b0c1e9e76366f98c21ef1804c9e" Dec 05 23:42:47 crc kubenswrapper[4734]: E1205 23:42:47.501077 4734 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"c5950e0647bc09668aa5bb962f9d7316a20f3b0c1e9e76366f98c21ef1804c9e\": container with ID starting with c5950e0647bc09668aa5bb962f9d7316a20f3b0c1e9e76366f98c21ef1804c9e not found: ID does not exist" containerID="c5950e0647bc09668aa5bb962f9d7316a20f3b0c1e9e76366f98c21ef1804c9e" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.501139 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5950e0647bc09668aa5bb962f9d7316a20f3b0c1e9e76366f98c21ef1804c9e"} err="failed to get container status \"c5950e0647bc09668aa5bb962f9d7316a20f3b0c1e9e76366f98c21ef1804c9e\": rpc error: code = NotFound desc = could not find container \"c5950e0647bc09668aa5bb962f9d7316a20f3b0c1e9e76366f98c21ef1804c9e\": container with ID starting with c5950e0647bc09668aa5bb962f9d7316a20f3b0c1e9e76366f98c21ef1804c9e not found: ID does not exist" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.632663 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1897868f-e014-4ddd-b906-6240a4e00ec3" path="/var/lib/kubelet/pods/1897868f-e014-4ddd-b906-6240a4e00ec3/volumes" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.681454 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.702768 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.717410 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-49zd8"] Dec 05 23:42:47 crc kubenswrapper[4734]: E1205 23:42:47.717995 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1897868f-e014-4ddd-b906-6240a4e00ec3" containerName="registry-server" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.718015 4734 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1897868f-e014-4ddd-b906-6240a4e00ec3" containerName="registry-server" Dec 05 23:42:47 crc kubenswrapper[4734]: E1205 23:42:47.718040 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c35eaa12-d993-4769-975b-35a5ac6609e0" containerName="setup-container" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.718048 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="c35eaa12-d993-4769-975b-35a5ac6609e0" containerName="setup-container" Dec 05 23:42:47 crc kubenswrapper[4734]: E1205 23:42:47.718061 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c35eaa12-d993-4769-975b-35a5ac6609e0" containerName="rabbitmq" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.718075 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="c35eaa12-d993-4769-975b-35a5ac6609e0" containerName="rabbitmq" Dec 05 23:42:47 crc kubenswrapper[4734]: E1205 23:42:47.718095 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1897868f-e014-4ddd-b906-6240a4e00ec3" containerName="extract-utilities" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.718104 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="1897868f-e014-4ddd-b906-6240a4e00ec3" containerName="extract-utilities" Dec 05 23:42:47 crc kubenswrapper[4734]: E1205 23:42:47.718119 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1897868f-e014-4ddd-b906-6240a4e00ec3" containerName="extract-content" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.718127 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="1897868f-e014-4ddd-b906-6240a4e00ec3" containerName="extract-content" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.718354 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="1897868f-e014-4ddd-b906-6240a4e00ec3" containerName="registry-server" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.718371 4734 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c35eaa12-d993-4769-975b-35a5ac6609e0" containerName="rabbitmq" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.719796 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-49zd8" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.726715 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.733127 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-49zd8"] Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.774587 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.776670 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.781838 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.782900 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.782237 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-7thdb" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.782333 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.783170 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.783219 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.793910 4734 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.825569 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-49zd8\" (UID: \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-49zd8" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.835072 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-49zd8\" (UID: \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-49zd8" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.835400 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-49zd8\" (UID: \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-49zd8" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.835460 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-49zd8\" (UID: \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-49zd8" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.835517 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-49zd8\" (UID: \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-49zd8" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.836675 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pnwk\" (UniqueName: \"kubernetes.io/projected/52ffd7e9-8c09-43d2-b7dd-909a39e83051-kube-api-access-8pnwk\") pod \"dnsmasq-dns-79bd4cc8c9-49zd8\" (UID: \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-49zd8" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.836849 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-config\") pod \"dnsmasq-dns-79bd4cc8c9-49zd8\" (UID: \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-49zd8" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.907229 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.942163 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-49zd8\" (UID: \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-49zd8" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.942226 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/556dbce3-075c-473a-ab0d-ea67ffc3e144-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " pod="openstack/rabbitmq-server-0" Dec 05 23:42:47 crc 
kubenswrapper[4734]: I1205 23:42:47.942249 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-49zd8\" (UID: \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-49zd8" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.942276 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/556dbce3-075c-473a-ab0d-ea67ffc3e144-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " pod="openstack/rabbitmq-server-0" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.942297 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-49zd8\" (UID: \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-49zd8" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.942312 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/556dbce3-075c-473a-ab0d-ea67ffc3e144-config-data\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " pod="openstack/rabbitmq-server-0" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.942332 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " pod="openstack/rabbitmq-server-0" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.942356 4734 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/556dbce3-075c-473a-ab0d-ea67ffc3e144-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " pod="openstack/rabbitmq-server-0" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.942373 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/556dbce3-075c-473a-ab0d-ea67ffc3e144-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " pod="openstack/rabbitmq-server-0" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.942402 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pnwk\" (UniqueName: \"kubernetes.io/projected/52ffd7e9-8c09-43d2-b7dd-909a39e83051-kube-api-access-8pnwk\") pod \"dnsmasq-dns-79bd4cc8c9-49zd8\" (UID: \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-49zd8" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.942447 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/556dbce3-075c-473a-ab0d-ea67ffc3e144-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " pod="openstack/rabbitmq-server-0" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.942469 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-config\") pod \"dnsmasq-dns-79bd4cc8c9-49zd8\" (UID: \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-49zd8" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.942507 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-t2fxn\" (UniqueName: \"kubernetes.io/projected/556dbce3-075c-473a-ab0d-ea67ffc3e144-kube-api-access-t2fxn\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " pod="openstack/rabbitmq-server-0" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.942641 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-49zd8\" (UID: \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-49zd8" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.942668 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/556dbce3-075c-473a-ab0d-ea67ffc3e144-server-conf\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " pod="openstack/rabbitmq-server-0" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.942704 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-49zd8\" (UID: \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-49zd8" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.942748 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/556dbce3-075c-473a-ab0d-ea67ffc3e144-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " pod="openstack/rabbitmq-server-0" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.942784 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/556dbce3-075c-473a-ab0d-ea67ffc3e144-pod-info\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " pod="openstack/rabbitmq-server-0" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.944298 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-49zd8\" (UID: \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-49zd8" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.945008 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-49zd8\" (UID: \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-49zd8" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.946144 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-49zd8\" (UID: \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-49zd8" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.949368 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-49zd8\" (UID: \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-49zd8" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.949456 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-config\") pod 
\"dnsmasq-dns-79bd4cc8c9-49zd8\" (UID: \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-49zd8" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.955867 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-49zd8\" (UID: \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-49zd8" Dec 05 23:42:47 crc kubenswrapper[4734]: I1205 23:42:47.969245 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-49zd8"] Dec 05 23:42:47 crc kubenswrapper[4734]: E1205 23:42:47.972259 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-8pnwk], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-79bd4cc8c9-49zd8" podUID="52ffd7e9-8c09-43d2-b7dd-909a39e83051" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.002116 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55478c4467-p7nf9"] Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.004661 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-p7nf9" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.009461 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pnwk\" (UniqueName: \"kubernetes.io/projected/52ffd7e9-8c09-43d2-b7dd-909a39e83051-kube-api-access-8pnwk\") pod \"dnsmasq-dns-79bd4cc8c9-49zd8\" (UID: \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-49zd8" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.021221 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-p7nf9"] Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.029449 4734 generic.go:334] "Generic (PLEG): container finished" podID="ed95027c-1ded-4127-a341-7ee81018d4b6" containerID="e568c4fef629437d46890910627d0796cf4be63170bfe6ba198f3522a7208650" exitCode=0 Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.029611 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ed95027c-1ded-4127-a341-7ee81018d4b6","Type":"ContainerDied","Data":"e568c4fef629437d46890910627d0796cf4be63170bfe6ba198f3522a7208650"} Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.044302 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-49zd8" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.051370 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/556dbce3-075c-473a-ab0d-ea67ffc3e144-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " pod="openstack/rabbitmq-server-0" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.051489 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/556dbce3-075c-473a-ab0d-ea67ffc3e144-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " pod="openstack/rabbitmq-server-0" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.051557 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/556dbce3-075c-473a-ab0d-ea67ffc3e144-config-data\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " pod="openstack/rabbitmq-server-0" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.051589 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " pod="openstack/rabbitmq-server-0" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.054009 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/556dbce3-075c-473a-ab0d-ea67ffc3e144-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " pod="openstack/rabbitmq-server-0" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.054054 4734 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/556dbce3-075c-473a-ab0d-ea67ffc3e144-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " pod="openstack/rabbitmq-server-0" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.054205 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/556dbce3-075c-473a-ab0d-ea67ffc3e144-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " pod="openstack/rabbitmq-server-0" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.053046 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/556dbce3-075c-473a-ab0d-ea67ffc3e144-config-data\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " pod="openstack/rabbitmq-server-0" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.052558 4734 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-server-0" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.056707 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2fxn\" (UniqueName: \"kubernetes.io/projected/556dbce3-075c-473a-ab0d-ea67ffc3e144-kube-api-access-t2fxn\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " pod="openstack/rabbitmq-server-0" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.056815 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/556dbce3-075c-473a-ab0d-ea67ffc3e144-server-conf\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " pod="openstack/rabbitmq-server-0" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.057455 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/556dbce3-075c-473a-ab0d-ea67ffc3e144-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " pod="openstack/rabbitmq-server-0" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.057503 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/556dbce3-075c-473a-ab0d-ea67ffc3e144-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " pod="openstack/rabbitmq-server-0" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.057510 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/556dbce3-075c-473a-ab0d-ea67ffc3e144-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " pod="openstack/rabbitmq-server-0" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.057589 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/556dbce3-075c-473a-ab0d-ea67ffc3e144-pod-info\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " pod="openstack/rabbitmq-server-0" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.057788 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/556dbce3-075c-473a-ab0d-ea67ffc3e144-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " 
pod="openstack/rabbitmq-server-0" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.058690 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/556dbce3-075c-473a-ab0d-ea67ffc3e144-server-conf\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " pod="openstack/rabbitmq-server-0" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.063299 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/556dbce3-075c-473a-ab0d-ea67ffc3e144-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " pod="openstack/rabbitmq-server-0" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.064677 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/556dbce3-075c-473a-ab0d-ea67ffc3e144-pod-info\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " pod="openstack/rabbitmq-server-0" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.064962 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/556dbce3-075c-473a-ab0d-ea67ffc3e144-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " pod="openstack/rabbitmq-server-0" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.069799 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-49zd8" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.080660 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/556dbce3-075c-473a-ab0d-ea67ffc3e144-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " pod="openstack/rabbitmq-server-0" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.085560 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2fxn\" (UniqueName: \"kubernetes.io/projected/556dbce3-075c-473a-ab0d-ea67ffc3e144-kube-api-access-t2fxn\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " pod="openstack/rabbitmq-server-0" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.092654 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.097731 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"556dbce3-075c-473a-ab0d-ea67ffc3e144\") " pod="openstack/rabbitmq-server-0" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.138204 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.159711 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-dns-svc\") pod \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\" (UID: \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\") " Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.159769 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ed95027c-1ded-4127-a341-7ee81018d4b6-rabbitmq-confd\") pod \"ed95027c-1ded-4127-a341-7ee81018d4b6\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.159815 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed95027c-1ded-4127-a341-7ee81018d4b6-config-data\") pod \"ed95027c-1ded-4127-a341-7ee81018d4b6\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.159853 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-ovsdbserver-sb\") pod \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\" (UID: \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\") " Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.159907 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pnwk\" (UniqueName: \"kubernetes.io/projected/52ffd7e9-8c09-43d2-b7dd-909a39e83051-kube-api-access-8pnwk\") pod \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\" (UID: \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\") " Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.159936 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/ed95027c-1ded-4127-a341-7ee81018d4b6-pod-info\") pod \"ed95027c-1ded-4127-a341-7ee81018d4b6\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.159964 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ed95027c-1ded-4127-a341-7ee81018d4b6-erlang-cookie-secret\") pod \"ed95027c-1ded-4127-a341-7ee81018d4b6\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.160034 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-ovsdbserver-nb\") pod \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\" (UID: \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\") " Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.160069 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ed95027c-1ded-4127-a341-7ee81018d4b6-rabbitmq-erlang-cookie\") pod \"ed95027c-1ded-4127-a341-7ee81018d4b6\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.160109 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c69bp\" (UniqueName: \"kubernetes.io/projected/ed95027c-1ded-4127-a341-7ee81018d4b6-kube-api-access-c69bp\") pod \"ed95027c-1ded-4127-a341-7ee81018d4b6\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.160219 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-openstack-edpm-ipam\") pod \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\" (UID: \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\") " Dec 05 
23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.160245 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ed95027c-1ded-4127-a341-7ee81018d4b6-rabbitmq-plugins\") pod \"ed95027c-1ded-4127-a341-7ee81018d4b6\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.160294 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-dns-swift-storage-0\") pod \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\" (UID: \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\") " Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.160317 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ed95027c-1ded-4127-a341-7ee81018d4b6-rabbitmq-tls\") pod \"ed95027c-1ded-4127-a341-7ee81018d4b6\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.160346 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ed95027c-1ded-4127-a341-7ee81018d4b6-server-conf\") pod \"ed95027c-1ded-4127-a341-7ee81018d4b6\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.160411 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ed95027c-1ded-4127-a341-7ee81018d4b6-plugins-conf\") pod \"ed95027c-1ded-4127-a341-7ee81018d4b6\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.160461 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-config\") pod \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\" (UID: \"52ffd7e9-8c09-43d2-b7dd-909a39e83051\") " Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.160486 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ed95027c-1ded-4127-a341-7ee81018d4b6\" (UID: \"ed95027c-1ded-4127-a341-7ee81018d4b6\") " Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.160918 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/83b046ba-a4ad-4e9b-b266-a23db4ef72ae-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-p7nf9\" (UID: \"83b046ba-a4ad-4e9b-b266-a23db4ef72ae\") " pod="openstack/dnsmasq-dns-55478c4467-p7nf9" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.160959 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83b046ba-a4ad-4e9b-b266-a23db4ef72ae-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-p7nf9\" (UID: \"83b046ba-a4ad-4e9b-b266-a23db4ef72ae\") " pod="openstack/dnsmasq-dns-55478c4467-p7nf9" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.160985 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83b046ba-a4ad-4e9b-b266-a23db4ef72ae-config\") pod \"dnsmasq-dns-55478c4467-p7nf9\" (UID: \"83b046ba-a4ad-4e9b-b266-a23db4ef72ae\") " pod="openstack/dnsmasq-dns-55478c4467-p7nf9" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.161126 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6fng\" (UniqueName: 
\"kubernetes.io/projected/83b046ba-a4ad-4e9b-b266-a23db4ef72ae-kube-api-access-w6fng\") pod \"dnsmasq-dns-55478c4467-p7nf9\" (UID: \"83b046ba-a4ad-4e9b-b266-a23db4ef72ae\") " pod="openstack/dnsmasq-dns-55478c4467-p7nf9" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.161171 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83b046ba-a4ad-4e9b-b266-a23db4ef72ae-dns-svc\") pod \"dnsmasq-dns-55478c4467-p7nf9\" (UID: \"83b046ba-a4ad-4e9b-b266-a23db4ef72ae\") " pod="openstack/dnsmasq-dns-55478c4467-p7nf9" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.161213 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83b046ba-a4ad-4e9b-b266-a23db4ef72ae-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-p7nf9\" (UID: \"83b046ba-a4ad-4e9b-b266-a23db4ef72ae\") " pod="openstack/dnsmasq-dns-55478c4467-p7nf9" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.161248 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83b046ba-a4ad-4e9b-b266-a23db4ef72ae-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-p7nf9\" (UID: \"83b046ba-a4ad-4e9b-b266-a23db4ef72ae\") " pod="openstack/dnsmasq-dns-55478c4467-p7nf9" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.161323 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "52ffd7e9-8c09-43d2-b7dd-909a39e83051" (UID: "52ffd7e9-8c09-43d2-b7dd-909a39e83051"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.161739 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed95027c-1ded-4127-a341-7ee81018d4b6-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "ed95027c-1ded-4127-a341-7ee81018d4b6" (UID: "ed95027c-1ded-4127-a341-7ee81018d4b6"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.162297 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "52ffd7e9-8c09-43d2-b7dd-909a39e83051" (UID: "52ffd7e9-8c09-43d2-b7dd-909a39e83051"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.166216 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-config" (OuterVolumeSpecName: "config") pod "52ffd7e9-8c09-43d2-b7dd-909a39e83051" (UID: "52ffd7e9-8c09-43d2-b7dd-909a39e83051"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.167376 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed95027c-1ded-4127-a341-7ee81018d4b6-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "ed95027c-1ded-4127-a341-7ee81018d4b6" (UID: "ed95027c-1ded-4127-a341-7ee81018d4b6"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.168054 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed95027c-1ded-4127-a341-7ee81018d4b6-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "ed95027c-1ded-4127-a341-7ee81018d4b6" (UID: "ed95027c-1ded-4127-a341-7ee81018d4b6"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.168759 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "52ffd7e9-8c09-43d2-b7dd-909a39e83051" (UID: "52ffd7e9-8c09-43d2-b7dd-909a39e83051"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.171162 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "52ffd7e9-8c09-43d2-b7dd-909a39e83051" (UID: "52ffd7e9-8c09-43d2-b7dd-909a39e83051"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.182819 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "52ffd7e9-8c09-43d2-b7dd-909a39e83051" (UID: "52ffd7e9-8c09-43d2-b7dd-909a39e83051"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.192842 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ed95027c-1ded-4127-a341-7ee81018d4b6-pod-info" (OuterVolumeSpecName: "pod-info") pod "ed95027c-1ded-4127-a341-7ee81018d4b6" (UID: "ed95027c-1ded-4127-a341-7ee81018d4b6"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.255766 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "ed95027c-1ded-4127-a341-7ee81018d4b6" (UID: "ed95027c-1ded-4127-a341-7ee81018d4b6"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.255818 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52ffd7e9-8c09-43d2-b7dd-909a39e83051-kube-api-access-8pnwk" (OuterVolumeSpecName: "kube-api-access-8pnwk") pod "52ffd7e9-8c09-43d2-b7dd-909a39e83051" (UID: "52ffd7e9-8c09-43d2-b7dd-909a39e83051"). InnerVolumeSpecName "kube-api-access-8pnwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.255943 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed95027c-1ded-4127-a341-7ee81018d4b6-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "ed95027c-1ded-4127-a341-7ee81018d4b6" (UID: "ed95027c-1ded-4127-a341-7ee81018d4b6"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.256024 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed95027c-1ded-4127-a341-7ee81018d4b6-kube-api-access-c69bp" (OuterVolumeSpecName: "kube-api-access-c69bp") pod "ed95027c-1ded-4127-a341-7ee81018d4b6" (UID: "ed95027c-1ded-4127-a341-7ee81018d4b6"). InnerVolumeSpecName "kube-api-access-c69bp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.257144 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed95027c-1ded-4127-a341-7ee81018d4b6-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "ed95027c-1ded-4127-a341-7ee81018d4b6" (UID: "ed95027c-1ded-4127-a341-7ee81018d4b6"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.263299 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6fng\" (UniqueName: \"kubernetes.io/projected/83b046ba-a4ad-4e9b-b266-a23db4ef72ae-kube-api-access-w6fng\") pod \"dnsmasq-dns-55478c4467-p7nf9\" (UID: \"83b046ba-a4ad-4e9b-b266-a23db4ef72ae\") " pod="openstack/dnsmasq-dns-55478c4467-p7nf9" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.263359 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83b046ba-a4ad-4e9b-b266-a23db4ef72ae-dns-svc\") pod \"dnsmasq-dns-55478c4467-p7nf9\" (UID: \"83b046ba-a4ad-4e9b-b266-a23db4ef72ae\") " pod="openstack/dnsmasq-dns-55478c4467-p7nf9" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.263396 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83b046ba-a4ad-4e9b-b266-a23db4ef72ae-ovsdbserver-sb\") pod 
\"dnsmasq-dns-55478c4467-p7nf9\" (UID: \"83b046ba-a4ad-4e9b-b266-a23db4ef72ae\") " pod="openstack/dnsmasq-dns-55478c4467-p7nf9" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.263427 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83b046ba-a4ad-4e9b-b266-a23db4ef72ae-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-p7nf9\" (UID: \"83b046ba-a4ad-4e9b-b266-a23db4ef72ae\") " pod="openstack/dnsmasq-dns-55478c4467-p7nf9" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.263467 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/83b046ba-a4ad-4e9b-b266-a23db4ef72ae-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-p7nf9\" (UID: \"83b046ba-a4ad-4e9b-b266-a23db4ef72ae\") " pod="openstack/dnsmasq-dns-55478c4467-p7nf9" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.263489 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83b046ba-a4ad-4e9b-b266-a23db4ef72ae-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-p7nf9\" (UID: \"83b046ba-a4ad-4e9b-b266-a23db4ef72ae\") " pod="openstack/dnsmasq-dns-55478c4467-p7nf9" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.263922 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83b046ba-a4ad-4e9b-b266-a23db4ef72ae-config\") pod \"dnsmasq-dns-55478c4467-p7nf9\" (UID: \"83b046ba-a4ad-4e9b-b266-a23db4ef72ae\") " pod="openstack/dnsmasq-dns-55478c4467-p7nf9" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.263994 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pnwk\" (UniqueName: \"kubernetes.io/projected/52ffd7e9-8c09-43d2-b7dd-909a39e83051-kube-api-access-8pnwk\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:48 crc 
kubenswrapper[4734]: I1205 23:42:48.264007 4734 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ed95027c-1ded-4127-a341-7ee81018d4b6-pod-info\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.264017 4734 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ed95027c-1ded-4127-a341-7ee81018d4b6-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.264029 4734 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.264039 4734 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ed95027c-1ded-4127-a341-7ee81018d4b6-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.264048 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c69bp\" (UniqueName: \"kubernetes.io/projected/ed95027c-1ded-4127-a341-7ee81018d4b6-kube-api-access-c69bp\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.264058 4734 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.264067 4734 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ed95027c-1ded-4127-a341-7ee81018d4b6-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.264076 4734 reconciler_common.go:293] 
"Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.264085 4734 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ed95027c-1ded-4127-a341-7ee81018d4b6-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.264096 4734 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ed95027c-1ded-4127-a341-7ee81018d4b6-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.264105 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.264125 4734 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.264136 4734 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.264147 4734 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/52ffd7e9-8c09-43d2-b7dd-909a39e83051-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.283192 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83b046ba-a4ad-4e9b-b266-a23db4ef72ae-ovsdbserver-sb\") pod 
\"dnsmasq-dns-55478c4467-p7nf9\" (UID: \"83b046ba-a4ad-4e9b-b266-a23db4ef72ae\") " pod="openstack/dnsmasq-dns-55478c4467-p7nf9" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.283423 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83b046ba-a4ad-4e9b-b266-a23db4ef72ae-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-p7nf9\" (UID: \"83b046ba-a4ad-4e9b-b266-a23db4ef72ae\") " pod="openstack/dnsmasq-dns-55478c4467-p7nf9" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.283825 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/83b046ba-a4ad-4e9b-b266-a23db4ef72ae-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-p7nf9\" (UID: \"83b046ba-a4ad-4e9b-b266-a23db4ef72ae\") " pod="openstack/dnsmasq-dns-55478c4467-p7nf9" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.285349 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83b046ba-a4ad-4e9b-b266-a23db4ef72ae-dns-svc\") pod \"dnsmasq-dns-55478c4467-p7nf9\" (UID: \"83b046ba-a4ad-4e9b-b266-a23db4ef72ae\") " pod="openstack/dnsmasq-dns-55478c4467-p7nf9" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.293226 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83b046ba-a4ad-4e9b-b266-a23db4ef72ae-config\") pod \"dnsmasq-dns-55478c4467-p7nf9\" (UID: \"83b046ba-a4ad-4e9b-b266-a23db4ef72ae\") " pod="openstack/dnsmasq-dns-55478c4467-p7nf9" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.293481 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83b046ba-a4ad-4e9b-b266-a23db4ef72ae-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-p7nf9\" (UID: \"83b046ba-a4ad-4e9b-b266-a23db4ef72ae\") " 
pod="openstack/dnsmasq-dns-55478c4467-p7nf9" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.305983 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed95027c-1ded-4127-a341-7ee81018d4b6-config-data" (OuterVolumeSpecName: "config-data") pod "ed95027c-1ded-4127-a341-7ee81018d4b6" (UID: "ed95027c-1ded-4127-a341-7ee81018d4b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.310019 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6fng\" (UniqueName: \"kubernetes.io/projected/83b046ba-a4ad-4e9b-b266-a23db4ef72ae-kube-api-access-w6fng\") pod \"dnsmasq-dns-55478c4467-p7nf9\" (UID: \"83b046ba-a4ad-4e9b-b266-a23db4ef72ae\") " pod="openstack/dnsmasq-dns-55478c4467-p7nf9" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.342713 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-p7nf9" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.382182 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed95027c-1ded-4127-a341-7ee81018d4b6-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.386020 4734 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.403087 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed95027c-1ded-4127-a341-7ee81018d4b6-server-conf" (OuterVolumeSpecName: "server-conf") pod "ed95027c-1ded-4127-a341-7ee81018d4b6" (UID: "ed95027c-1ded-4127-a341-7ee81018d4b6"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.484254 4734 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ed95027c-1ded-4127-a341-7ee81018d4b6-server-conf\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.484310 4734 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.572188 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed95027c-1ded-4127-a341-7ee81018d4b6-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "ed95027c-1ded-4127-a341-7ee81018d4b6" (UID: "ed95027c-1ded-4127-a341-7ee81018d4b6"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:42:48 crc kubenswrapper[4734]: I1205 23:42:48.586842 4734 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ed95027c-1ded-4127-a341-7ee81018d4b6-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.034993 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.059504 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"556dbce3-075c-473a-ab0d-ea67ffc3e144","Type":"ContainerStarted","Data":"e553fd50cb807864f8356f26f8b6727a47232635cf732aa724878aabbbe4a3b0"} Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.062447 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-49zd8" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.062968 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ed95027c-1ded-4127-a341-7ee81018d4b6","Type":"ContainerDied","Data":"e2856be192207e70f4ec1dd3befd5a54fb82ddf659138d1d1d0d6852a043dfce"} Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.063008 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.063064 4734 scope.go:117] "RemoveContainer" containerID="e568c4fef629437d46890910627d0796cf4be63170bfe6ba198f3522a7208650" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.133935 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-p7nf9"] Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.157382 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-49zd8"] Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.160770 4734 scope.go:117] "RemoveContainer" containerID="66e5a249cf9e8b0a22292ba791d1aa360ef84159f879636f2af4ddcac64c1e31" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.167415 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-49zd8"] Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.190720 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.215719 4734 scope.go:117] "RemoveContainer" containerID="66e5a249cf9e8b0a22292ba791d1aa360ef84159f879636f2af4ddcac64c1e31" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.216035 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.277865 4734 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 23:42:49 crc kubenswrapper[4734]: E1205 23:42:49.278478 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed95027c-1ded-4127-a341-7ee81018d4b6" containerName="rabbitmq" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.278493 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed95027c-1ded-4127-a341-7ee81018d4b6" containerName="rabbitmq" Dec 05 23:42:49 crc kubenswrapper[4734]: E1205 23:42:49.278563 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed95027c-1ded-4127-a341-7ee81018d4b6" containerName="setup-container" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.278574 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed95027c-1ded-4127-a341-7ee81018d4b6" containerName="setup-container" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.278761 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed95027c-1ded-4127-a341-7ee81018d4b6" containerName="rabbitmq" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.280018 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.286508 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.286841 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.287316 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.289275 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.293113 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-ncglt" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.293123 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.293192 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.293207 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.442799 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.442879 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/34a9d7ac-2a42-4352-8eb3-23d34cfc5696-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.442912 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/34a9d7ac-2a42-4352-8eb3-23d34cfc5696-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.442944 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/34a9d7ac-2a42-4352-8eb3-23d34cfc5696-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.442982 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcjw5\" (UniqueName: \"kubernetes.io/projected/34a9d7ac-2a42-4352-8eb3-23d34cfc5696-kube-api-access-dcjw5\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.443011 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/34a9d7ac-2a42-4352-8eb3-23d34cfc5696-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.443049 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" 
(UniqueName: \"kubernetes.io/configmap/34a9d7ac-2a42-4352-8eb3-23d34cfc5696-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.443078 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/34a9d7ac-2a42-4352-8eb3-23d34cfc5696-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.443114 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/34a9d7ac-2a42-4352-8eb3-23d34cfc5696-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.443161 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/34a9d7ac-2a42-4352-8eb3-23d34cfc5696-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.443201 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/34a9d7ac-2a42-4352-8eb3-23d34cfc5696-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: E1205 23:42:49.517838 4734 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete 
container k8s_setup-container_rabbitmq-cell1-server-0_openstack_ed95027c-1ded-4127-a341-7ee81018d4b6_0 in pod sandbox e2856be192207e70f4ec1dd3befd5a54fb82ddf659138d1d1d0d6852a043dfce from index: no such id: '66e5a249cf9e8b0a22292ba791d1aa360ef84159f879636f2af4ddcac64c1e31'" containerID="66e5a249cf9e8b0a22292ba791d1aa360ef84159f879636f2af4ddcac64c1e31" Dec 05 23:42:49 crc kubenswrapper[4734]: E1205 23:42:49.517951 4734 kuberuntime_gc.go:150] "Failed to remove container" err="rpc error: code = Unknown desc = failed to delete container k8s_setup-container_rabbitmq-cell1-server-0_openstack_ed95027c-1ded-4127-a341-7ee81018d4b6_0 in pod sandbox e2856be192207e70f4ec1dd3befd5a54fb82ddf659138d1d1d0d6852a043dfce from index: no such id: '66e5a249cf9e8b0a22292ba791d1aa360ef84159f879636f2af4ddcac64c1e31'" containerID="66e5a249cf9e8b0a22292ba791d1aa360ef84159f879636f2af4ddcac64c1e31" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.545121 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.545202 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/34a9d7ac-2a42-4352-8eb3-23d34cfc5696-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.545234 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/34a9d7ac-2a42-4352-8eb3-23d34cfc5696-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" 
Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.545253 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/34a9d7ac-2a42-4352-8eb3-23d34cfc5696-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.545287 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcjw5\" (UniqueName: \"kubernetes.io/projected/34a9d7ac-2a42-4352-8eb3-23d34cfc5696-kube-api-access-dcjw5\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.545316 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/34a9d7ac-2a42-4352-8eb3-23d34cfc5696-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.545358 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/34a9d7ac-2a42-4352-8eb3-23d34cfc5696-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.545381 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/34a9d7ac-2a42-4352-8eb3-23d34cfc5696-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.545425 4734 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/34a9d7ac-2a42-4352-8eb3-23d34cfc5696-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.545487 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/34a9d7ac-2a42-4352-8eb3-23d34cfc5696-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.545552 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/34a9d7ac-2a42-4352-8eb3-23d34cfc5696-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.547509 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/34a9d7ac-2a42-4352-8eb3-23d34cfc5696-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.547934 4734 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.548120 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/34a9d7ac-2a42-4352-8eb3-23d34cfc5696-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.548198 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/34a9d7ac-2a42-4352-8eb3-23d34cfc5696-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.548411 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/34a9d7ac-2a42-4352-8eb3-23d34cfc5696-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.550037 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/34a9d7ac-2a42-4352-8eb3-23d34cfc5696-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.551816 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/34a9d7ac-2a42-4352-8eb3-23d34cfc5696-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.553744 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/34a9d7ac-2a42-4352-8eb3-23d34cfc5696-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.554126 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/34a9d7ac-2a42-4352-8eb3-23d34cfc5696-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.562677 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/34a9d7ac-2a42-4352-8eb3-23d34cfc5696-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.588892 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcjw5\" (UniqueName: \"kubernetes.io/projected/34a9d7ac-2a42-4352-8eb3-23d34cfc5696-kube-api-access-dcjw5\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.608044 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"34a9d7ac-2a42-4352-8eb3-23d34cfc5696\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.653354 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52ffd7e9-8c09-43d2-b7dd-909a39e83051" path="/var/lib/kubelet/pods/52ffd7e9-8c09-43d2-b7dd-909a39e83051/volumes" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.657753 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c35eaa12-d993-4769-975b-35a5ac6609e0" 
path="/var/lib/kubelet/pods/c35eaa12-d993-4769-975b-35a5ac6609e0/volumes" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.661767 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed95027c-1ded-4127-a341-7ee81018d4b6" path="/var/lib/kubelet/pods/ed95027c-1ded-4127-a341-7ee81018d4b6/volumes" Dec 05 23:42:49 crc kubenswrapper[4734]: I1205 23:42:49.830040 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:42:50 crc kubenswrapper[4734]: I1205 23:42:50.081890 4734 generic.go:334] "Generic (PLEG): container finished" podID="83b046ba-a4ad-4e9b-b266-a23db4ef72ae" containerID="ea1bd111ea3c09a9c48937043924d72e5230434f8fd0fdde497d62116c8ca0a8" exitCode=0 Dec 05 23:42:50 crc kubenswrapper[4734]: I1205 23:42:50.082395 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-p7nf9" event={"ID":"83b046ba-a4ad-4e9b-b266-a23db4ef72ae","Type":"ContainerDied","Data":"ea1bd111ea3c09a9c48937043924d72e5230434f8fd0fdde497d62116c8ca0a8"} Dec 05 23:42:50 crc kubenswrapper[4734]: I1205 23:42:50.082433 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-p7nf9" event={"ID":"83b046ba-a4ad-4e9b-b266-a23db4ef72ae","Type":"ContainerStarted","Data":"e13f444ce4d5e346a98d48525320c46be58f414a1c85a4b69bf084529c8153e4"} Dec 05 23:42:50 crc kubenswrapper[4734]: I1205 23:42:50.379114 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 23:42:50 crc kubenswrapper[4734]: I1205 23:42:50.444562 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:42:50 crc kubenswrapper[4734]: I1205 23:42:50.444639 4734 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:42:50 crc kubenswrapper[4734]: I1205 23:42:50.444703 4734 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" Dec 05 23:42:50 crc kubenswrapper[4734]: I1205 23:42:50.445759 4734 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b56de5effd3c2004c857decd42f072613bd8b7411853b07107e3e799cc6c9cfb"} pod="openshift-machine-config-operator/machine-config-daemon-vn94d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 23:42:50 crc kubenswrapper[4734]: I1205 23:42:50.445829 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" containerID="cri-o://b56de5effd3c2004c857decd42f072613bd8b7411853b07107e3e799cc6c9cfb" gracePeriod=600 Dec 05 23:42:51 crc kubenswrapper[4734]: I1205 23:42:51.092590 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"34a9d7ac-2a42-4352-8eb3-23d34cfc5696","Type":"ContainerStarted","Data":"ff854f65fa0857a8d02f263f49156d0fe7052b220fbdf520c516b2501c121fd3"} Dec 05 23:42:51 crc kubenswrapper[4734]: I1205 23:42:51.095423 4734 generic.go:334] "Generic (PLEG): container finished" podID="65758270-a7a7-46b5-af95-0588daf9fa86" containerID="b56de5effd3c2004c857decd42f072613bd8b7411853b07107e3e799cc6c9cfb" exitCode=0 Dec 05 23:42:51 crc kubenswrapper[4734]: I1205 23:42:51.095710 4734 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerDied","Data":"b56de5effd3c2004c857decd42f072613bd8b7411853b07107e3e799cc6c9cfb"} Dec 05 23:42:51 crc kubenswrapper[4734]: I1205 23:42:51.097393 4734 scope.go:117] "RemoveContainer" containerID="5119dd9005e526fae1b15071e6d704440bd8834afc5ec6ce50aaa9f27c74ff90" Dec 05 23:42:51 crc kubenswrapper[4734]: I1205 23:42:51.104064 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-p7nf9" event={"ID":"83b046ba-a4ad-4e9b-b266-a23db4ef72ae","Type":"ContainerStarted","Data":"0f74e6d0c0b47c0226392bf318f6097666d3437d7f84d6499875fd19441fb54f"} Dec 05 23:42:51 crc kubenswrapper[4734]: I1205 23:42:51.104234 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55478c4467-p7nf9" Dec 05 23:42:51 crc kubenswrapper[4734]: I1205 23:42:51.137015 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55478c4467-p7nf9" podStartSLOduration=4.136992343 podStartE2EDuration="4.136992343s" podCreationTimestamp="2025-12-05 23:42:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:42:51.128298431 +0000 UTC m=+1391.811702717" watchObservedRunningTime="2025-12-05 23:42:51.136992343 +0000 UTC m=+1391.820396639" Dec 05 23:42:52 crc kubenswrapper[4734]: I1205 23:42:52.117310 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"556dbce3-075c-473a-ab0d-ea67ffc3e144","Type":"ContainerStarted","Data":"6e306ae768ee92fe6f6b9048ec889fde996d3f2ee4f1e523eae2355baa6c3e1e"} Dec 05 23:42:52 crc kubenswrapper[4734]: I1205 23:42:52.123029 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" 
event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerStarted","Data":"bf2990588260a60447594f55883e9e43735892e3ca942ebe017df1d6b8641fec"} Dec 05 23:42:53 crc kubenswrapper[4734]: I1205 23:42:53.150449 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"34a9d7ac-2a42-4352-8eb3-23d34cfc5696","Type":"ContainerStarted","Data":"8d126e674a7215d5df1508dd31b24f22e530661c0ef23f493f718ab7db8cee65"} Dec 05 23:42:58 crc kubenswrapper[4734]: I1205 23:42:58.344710 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55478c4467-p7nf9" Dec 05 23:42:58 crc kubenswrapper[4734]: I1205 23:42:58.434759 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-tklww"] Dec 05 23:42:58 crc kubenswrapper[4734]: I1205 23:42:58.435082 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-tklww" podUID="e4da9f35-b56d-47e7-9492-6e9379754584" containerName="dnsmasq-dns" containerID="cri-o://f0806da6272183f3f8a186f6210f91fca2724e1b742f689684f3bc5b212b57c6" gracePeriod=10 Dec 05 23:42:58 crc kubenswrapper[4734]: I1205 23:42:58.926261 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-tklww" Dec 05 23:42:58 crc kubenswrapper[4734]: I1205 23:42:58.978554 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4da9f35-b56d-47e7-9492-6e9379754584-config\") pod \"e4da9f35-b56d-47e7-9492-6e9379754584\" (UID: \"e4da9f35-b56d-47e7-9492-6e9379754584\") " Dec 05 23:42:58 crc kubenswrapper[4734]: I1205 23:42:58.978625 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4da9f35-b56d-47e7-9492-6e9379754584-ovsdbserver-sb\") pod \"e4da9f35-b56d-47e7-9492-6e9379754584\" (UID: \"e4da9f35-b56d-47e7-9492-6e9379754584\") " Dec 05 23:42:58 crc kubenswrapper[4734]: I1205 23:42:58.978704 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4da9f35-b56d-47e7-9492-6e9379754584-dns-svc\") pod \"e4da9f35-b56d-47e7-9492-6e9379754584\" (UID: \"e4da9f35-b56d-47e7-9492-6e9379754584\") " Dec 05 23:42:58 crc kubenswrapper[4734]: I1205 23:42:58.978936 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blrh8\" (UniqueName: \"kubernetes.io/projected/e4da9f35-b56d-47e7-9492-6e9379754584-kube-api-access-blrh8\") pod \"e4da9f35-b56d-47e7-9492-6e9379754584\" (UID: \"e4da9f35-b56d-47e7-9492-6e9379754584\") " Dec 05 23:42:58 crc kubenswrapper[4734]: I1205 23:42:58.979162 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e4da9f35-b56d-47e7-9492-6e9379754584-dns-swift-storage-0\") pod \"e4da9f35-b56d-47e7-9492-6e9379754584\" (UID: \"e4da9f35-b56d-47e7-9492-6e9379754584\") " Dec 05 23:42:58 crc kubenswrapper[4734]: I1205 23:42:58.979280 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/e4da9f35-b56d-47e7-9492-6e9379754584-ovsdbserver-nb\") pod \"e4da9f35-b56d-47e7-9492-6e9379754584\" (UID: \"e4da9f35-b56d-47e7-9492-6e9379754584\") " Dec 05 23:42:58 crc kubenswrapper[4734]: I1205 23:42:58.988876 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4da9f35-b56d-47e7-9492-6e9379754584-kube-api-access-blrh8" (OuterVolumeSpecName: "kube-api-access-blrh8") pod "e4da9f35-b56d-47e7-9492-6e9379754584" (UID: "e4da9f35-b56d-47e7-9492-6e9379754584"). InnerVolumeSpecName "kube-api-access-blrh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:42:59 crc kubenswrapper[4734]: I1205 23:42:59.042590 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4da9f35-b56d-47e7-9492-6e9379754584-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e4da9f35-b56d-47e7-9492-6e9379754584" (UID: "e4da9f35-b56d-47e7-9492-6e9379754584"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:42:59 crc kubenswrapper[4734]: I1205 23:42:59.044936 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4da9f35-b56d-47e7-9492-6e9379754584-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e4da9f35-b56d-47e7-9492-6e9379754584" (UID: "e4da9f35-b56d-47e7-9492-6e9379754584"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:42:59 crc kubenswrapper[4734]: I1205 23:42:59.049513 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4da9f35-b56d-47e7-9492-6e9379754584-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e4da9f35-b56d-47e7-9492-6e9379754584" (UID: "e4da9f35-b56d-47e7-9492-6e9379754584"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:42:59 crc kubenswrapper[4734]: I1205 23:42:59.066222 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4da9f35-b56d-47e7-9492-6e9379754584-config" (OuterVolumeSpecName: "config") pod "e4da9f35-b56d-47e7-9492-6e9379754584" (UID: "e4da9f35-b56d-47e7-9492-6e9379754584"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:42:59 crc kubenswrapper[4734]: I1205 23:42:59.075427 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4da9f35-b56d-47e7-9492-6e9379754584-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e4da9f35-b56d-47e7-9492-6e9379754584" (UID: "e4da9f35-b56d-47e7-9492-6e9379754584"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:42:59 crc kubenswrapper[4734]: I1205 23:42:59.082480 4734 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e4da9f35-b56d-47e7-9492-6e9379754584-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:59 crc kubenswrapper[4734]: I1205 23:42:59.082509 4734 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4da9f35-b56d-47e7-9492-6e9379754584-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:59 crc kubenswrapper[4734]: I1205 23:42:59.082551 4734 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4da9f35-b56d-47e7-9492-6e9379754584-config\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:59 crc kubenswrapper[4734]: I1205 23:42:59.082562 4734 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4da9f35-b56d-47e7-9492-6e9379754584-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" 
Dec 05 23:42:59 crc kubenswrapper[4734]: I1205 23:42:59.082571 4734 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4da9f35-b56d-47e7-9492-6e9379754584-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:59 crc kubenswrapper[4734]: I1205 23:42:59.082583 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blrh8\" (UniqueName: \"kubernetes.io/projected/e4da9f35-b56d-47e7-9492-6e9379754584-kube-api-access-blrh8\") on node \"crc\" DevicePath \"\"" Dec 05 23:42:59 crc kubenswrapper[4734]: I1205 23:42:59.219707 4734 generic.go:334] "Generic (PLEG): container finished" podID="e4da9f35-b56d-47e7-9492-6e9379754584" containerID="f0806da6272183f3f8a186f6210f91fca2724e1b742f689684f3bc5b212b57c6" exitCode=0 Dec 05 23:42:59 crc kubenswrapper[4734]: I1205 23:42:59.219831 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-tklww" Dec 05 23:42:59 crc kubenswrapper[4734]: I1205 23:42:59.219891 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-tklww" event={"ID":"e4da9f35-b56d-47e7-9492-6e9379754584","Type":"ContainerDied","Data":"f0806da6272183f3f8a186f6210f91fca2724e1b742f689684f3bc5b212b57c6"} Dec 05 23:42:59 crc kubenswrapper[4734]: I1205 23:42:59.222857 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-tklww" event={"ID":"e4da9f35-b56d-47e7-9492-6e9379754584","Type":"ContainerDied","Data":"4fda1812c6502facdd95bfe28189f58aa8947cb0c5d7c8f6bb20ec5f1fb3ad4a"} Dec 05 23:42:59 crc kubenswrapper[4734]: I1205 23:42:59.222889 4734 scope.go:117] "RemoveContainer" containerID="f0806da6272183f3f8a186f6210f91fca2724e1b742f689684f3bc5b212b57c6" Dec 05 23:42:59 crc kubenswrapper[4734]: I1205 23:42:59.265634 4734 scope.go:117] "RemoveContainer" containerID="f00caf4b47d9d95e2c7019ceb937384399123fab1923f9aa32c7a80977fe948c" Dec 05 23:42:59 crc 
kubenswrapper[4734]: I1205 23:42:59.268907 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-tklww"] Dec 05 23:42:59 crc kubenswrapper[4734]: I1205 23:42:59.285497 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-tklww"] Dec 05 23:42:59 crc kubenswrapper[4734]: I1205 23:42:59.289117 4734 scope.go:117] "RemoveContainer" containerID="f0806da6272183f3f8a186f6210f91fca2724e1b742f689684f3bc5b212b57c6" Dec 05 23:42:59 crc kubenswrapper[4734]: E1205 23:42:59.289762 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0806da6272183f3f8a186f6210f91fca2724e1b742f689684f3bc5b212b57c6\": container with ID starting with f0806da6272183f3f8a186f6210f91fca2724e1b742f689684f3bc5b212b57c6 not found: ID does not exist" containerID="f0806da6272183f3f8a186f6210f91fca2724e1b742f689684f3bc5b212b57c6" Dec 05 23:42:59 crc kubenswrapper[4734]: I1205 23:42:59.289822 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0806da6272183f3f8a186f6210f91fca2724e1b742f689684f3bc5b212b57c6"} err="failed to get container status \"f0806da6272183f3f8a186f6210f91fca2724e1b742f689684f3bc5b212b57c6\": rpc error: code = NotFound desc = could not find container \"f0806da6272183f3f8a186f6210f91fca2724e1b742f689684f3bc5b212b57c6\": container with ID starting with f0806da6272183f3f8a186f6210f91fca2724e1b742f689684f3bc5b212b57c6 not found: ID does not exist" Dec 05 23:42:59 crc kubenswrapper[4734]: I1205 23:42:59.289857 4734 scope.go:117] "RemoveContainer" containerID="f00caf4b47d9d95e2c7019ceb937384399123fab1923f9aa32c7a80977fe948c" Dec 05 23:42:59 crc kubenswrapper[4734]: E1205 23:42:59.290449 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f00caf4b47d9d95e2c7019ceb937384399123fab1923f9aa32c7a80977fe948c\": container with ID 
starting with f00caf4b47d9d95e2c7019ceb937384399123fab1923f9aa32c7a80977fe948c not found: ID does not exist" containerID="f00caf4b47d9d95e2c7019ceb937384399123fab1923f9aa32c7a80977fe948c" Dec 05 23:42:59 crc kubenswrapper[4734]: I1205 23:42:59.290485 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f00caf4b47d9d95e2c7019ceb937384399123fab1923f9aa32c7a80977fe948c"} err="failed to get container status \"f00caf4b47d9d95e2c7019ceb937384399123fab1923f9aa32c7a80977fe948c\": rpc error: code = NotFound desc = could not find container \"f00caf4b47d9d95e2c7019ceb937384399123fab1923f9aa32c7a80977fe948c\": container with ID starting with f00caf4b47d9d95e2c7019ceb937384399123fab1923f9aa32c7a80977fe948c not found: ID does not exist" Dec 05 23:42:59 crc kubenswrapper[4734]: I1205 23:42:59.629474 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4da9f35-b56d-47e7-9492-6e9379754584" path="/var/lib/kubelet/pods/e4da9f35-b56d-47e7-9492-6e9379754584/volumes" Dec 05 23:43:03 crc kubenswrapper[4734]: I1205 23:43:03.729214 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-89c5cd4d5-tklww" podUID="e4da9f35-b56d-47e7-9492-6e9379754584" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.197:5353: i/o timeout" Dec 05 23:43:06 crc kubenswrapper[4734]: I1205 23:43:06.882264 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p"] Dec 05 23:43:06 crc kubenswrapper[4734]: E1205 23:43:06.883622 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4da9f35-b56d-47e7-9492-6e9379754584" containerName="init" Dec 05 23:43:06 crc kubenswrapper[4734]: I1205 23:43:06.883638 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4da9f35-b56d-47e7-9492-6e9379754584" containerName="init" Dec 05 23:43:06 crc kubenswrapper[4734]: E1205 23:43:06.883675 4734 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e4da9f35-b56d-47e7-9492-6e9379754584" containerName="dnsmasq-dns" Dec 05 23:43:06 crc kubenswrapper[4734]: I1205 23:43:06.883682 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4da9f35-b56d-47e7-9492-6e9379754584" containerName="dnsmasq-dns" Dec 05 23:43:06 crc kubenswrapper[4734]: I1205 23:43:06.883879 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4da9f35-b56d-47e7-9492-6e9379754584" containerName="dnsmasq-dns" Dec 05 23:43:06 crc kubenswrapper[4734]: I1205 23:43:06.885200 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p" Dec 05 23:43:06 crc kubenswrapper[4734]: I1205 23:43:06.888158 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 23:43:06 crc kubenswrapper[4734]: I1205 23:43:06.888758 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 23:43:06 crc kubenswrapper[4734]: I1205 23:43:06.889168 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gsdqx" Dec 05 23:43:06 crc kubenswrapper[4734]: I1205 23:43:06.891329 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 23:43:06 crc kubenswrapper[4734]: I1205 23:43:06.896856 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p"] Dec 05 23:43:06 crc kubenswrapper[4734]: I1205 23:43:06.975977 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/12cd9906-9f9f-42ba-8869-54f39ae29366-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p\" (UID: \"12cd9906-9f9f-42ba-8869-54f39ae29366\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p" Dec 05 23:43:06 crc kubenswrapper[4734]: I1205 23:43:06.976111 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12cd9906-9f9f-42ba-8869-54f39ae29366-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p\" (UID: \"12cd9906-9f9f-42ba-8869-54f39ae29366\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p" Dec 05 23:43:06 crc kubenswrapper[4734]: I1205 23:43:06.976167 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktplm\" (UniqueName: \"kubernetes.io/projected/12cd9906-9f9f-42ba-8869-54f39ae29366-kube-api-access-ktplm\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p\" (UID: \"12cd9906-9f9f-42ba-8869-54f39ae29366\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p" Dec 05 23:43:06 crc kubenswrapper[4734]: I1205 23:43:06.976221 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12cd9906-9f9f-42ba-8869-54f39ae29366-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p\" (UID: \"12cd9906-9f9f-42ba-8869-54f39ae29366\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p" Dec 05 23:43:07 crc kubenswrapper[4734]: I1205 23:43:07.077736 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/12cd9906-9f9f-42ba-8869-54f39ae29366-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p\" (UID: \"12cd9906-9f9f-42ba-8869-54f39ae29366\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p" Dec 05 23:43:07 crc kubenswrapper[4734]: I1205 23:43:07.077846 4734 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12cd9906-9f9f-42ba-8869-54f39ae29366-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p\" (UID: \"12cd9906-9f9f-42ba-8869-54f39ae29366\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p" Dec 05 23:43:07 crc kubenswrapper[4734]: I1205 23:43:07.077894 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktplm\" (UniqueName: \"kubernetes.io/projected/12cd9906-9f9f-42ba-8869-54f39ae29366-kube-api-access-ktplm\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p\" (UID: \"12cd9906-9f9f-42ba-8869-54f39ae29366\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p" Dec 05 23:43:07 crc kubenswrapper[4734]: I1205 23:43:07.077939 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12cd9906-9f9f-42ba-8869-54f39ae29366-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p\" (UID: \"12cd9906-9f9f-42ba-8869-54f39ae29366\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p" Dec 05 23:43:07 crc kubenswrapper[4734]: I1205 23:43:07.095314 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12cd9906-9f9f-42ba-8869-54f39ae29366-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p\" (UID: \"12cd9906-9f9f-42ba-8869-54f39ae29366\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p" Dec 05 23:43:07 crc kubenswrapper[4734]: I1205 23:43:07.125453 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/12cd9906-9f9f-42ba-8869-54f39ae29366-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p\" (UID: 
\"12cd9906-9f9f-42ba-8869-54f39ae29366\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p" Dec 05 23:43:07 crc kubenswrapper[4734]: I1205 23:43:07.126267 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12cd9906-9f9f-42ba-8869-54f39ae29366-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p\" (UID: \"12cd9906-9f9f-42ba-8869-54f39ae29366\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p" Dec 05 23:43:07 crc kubenswrapper[4734]: I1205 23:43:07.159489 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktplm\" (UniqueName: \"kubernetes.io/projected/12cd9906-9f9f-42ba-8869-54f39ae29366-kube-api-access-ktplm\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p\" (UID: \"12cd9906-9f9f-42ba-8869-54f39ae29366\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p" Dec 05 23:43:07 crc kubenswrapper[4734]: I1205 23:43:07.214109 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p" Dec 05 23:43:07 crc kubenswrapper[4734]: W1205 23:43:07.855864 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12cd9906_9f9f_42ba_8869_54f39ae29366.slice/crio-29db5ecf5defefdbcd5a9d425753a84af511b96f9bb5a14044059ca4280df18d WatchSource:0}: Error finding container 29db5ecf5defefdbcd5a9d425753a84af511b96f9bb5a14044059ca4280df18d: Status 404 returned error can't find the container with id 29db5ecf5defefdbcd5a9d425753a84af511b96f9bb5a14044059ca4280df18d Dec 05 23:43:07 crc kubenswrapper[4734]: I1205 23:43:07.861035 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p"] Dec 05 23:43:08 crc kubenswrapper[4734]: I1205 23:43:08.328539 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p" event={"ID":"12cd9906-9f9f-42ba-8869-54f39ae29366","Type":"ContainerStarted","Data":"29db5ecf5defefdbcd5a9d425753a84af511b96f9bb5a14044059ca4280df18d"} Dec 05 23:43:17 crc kubenswrapper[4734]: I1205 23:43:17.451292 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p" event={"ID":"12cd9906-9f9f-42ba-8869-54f39ae29366","Type":"ContainerStarted","Data":"07b72c61a7755bb1778bccb2f9c2d62914f43823423f820b4181c1515f07f50b"} Dec 05 23:43:17 crc kubenswrapper[4734]: I1205 23:43:17.478282 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p" podStartSLOduration=2.529836132 podStartE2EDuration="11.478256362s" podCreationTimestamp="2025-12-05 23:43:06 +0000 UTC" firstStartedPulling="2025-12-05 23:43:07.85883093 +0000 UTC m=+1408.542235206" lastFinishedPulling="2025-12-05 23:43:16.80725116 +0000 UTC m=+1417.490655436" 
observedRunningTime="2025-12-05 23:43:17.473663619 +0000 UTC m=+1418.157067895" watchObservedRunningTime="2025-12-05 23:43:17.478256362 +0000 UTC m=+1418.161660638" Dec 05 23:43:24 crc kubenswrapper[4734]: I1205 23:43:24.526731 4734 generic.go:334] "Generic (PLEG): container finished" podID="556dbce3-075c-473a-ab0d-ea67ffc3e144" containerID="6e306ae768ee92fe6f6b9048ec889fde996d3f2ee4f1e523eae2355baa6c3e1e" exitCode=0 Dec 05 23:43:24 crc kubenswrapper[4734]: I1205 23:43:24.526844 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"556dbce3-075c-473a-ab0d-ea67ffc3e144","Type":"ContainerDied","Data":"6e306ae768ee92fe6f6b9048ec889fde996d3f2ee4f1e523eae2355baa6c3e1e"} Dec 05 23:43:25 crc kubenswrapper[4734]: I1205 23:43:25.542095 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"556dbce3-075c-473a-ab0d-ea67ffc3e144","Type":"ContainerStarted","Data":"a01448e26cc035422c0ec17748b0aec717219e5117ee5befbaacc1173c5ee2b5"} Dec 05 23:43:25 crc kubenswrapper[4734]: I1205 23:43:25.542804 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 05 23:43:25 crc kubenswrapper[4734]: I1205 23:43:25.544797 4734 generic.go:334] "Generic (PLEG): container finished" podID="34a9d7ac-2a42-4352-8eb3-23d34cfc5696" containerID="8d126e674a7215d5df1508dd31b24f22e530661c0ef23f493f718ab7db8cee65" exitCode=0 Dec 05 23:43:25 crc kubenswrapper[4734]: I1205 23:43:25.544836 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"34a9d7ac-2a42-4352-8eb3-23d34cfc5696","Type":"ContainerDied","Data":"8d126e674a7215d5df1508dd31b24f22e530661c0ef23f493f718ab7db8cee65"} Dec 05 23:43:25 crc kubenswrapper[4734]: I1205 23:43:25.585592 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.585572717 podStartE2EDuration="38.585572717s" 
podCreationTimestamp="2025-12-05 23:42:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:43:25.582702508 +0000 UTC m=+1426.266106784" watchObservedRunningTime="2025-12-05 23:43:25.585572717 +0000 UTC m=+1426.268976993" Dec 05 23:43:26 crc kubenswrapper[4734]: I1205 23:43:26.567870 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"34a9d7ac-2a42-4352-8eb3-23d34cfc5696","Type":"ContainerStarted","Data":"ccf94b510c9790e19a72f913e680e277f8c1c22adf37457c8fb163c87403d16d"} Dec 05 23:43:26 crc kubenswrapper[4734]: I1205 23:43:26.568585 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:43:26 crc kubenswrapper[4734]: I1205 23:43:26.605686 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.605650438 podStartE2EDuration="37.605650438s" podCreationTimestamp="2025-12-05 23:42:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 23:43:26.603961636 +0000 UTC m=+1427.287365912" watchObservedRunningTime="2025-12-05 23:43:26.605650438 +0000 UTC m=+1427.289054714" Dec 05 23:43:30 crc kubenswrapper[4734]: I1205 23:43:30.609490 4734 generic.go:334] "Generic (PLEG): container finished" podID="12cd9906-9f9f-42ba-8869-54f39ae29366" containerID="07b72c61a7755bb1778bccb2f9c2d62914f43823423f820b4181c1515f07f50b" exitCode=0 Dec 05 23:43:30 crc kubenswrapper[4734]: I1205 23:43:30.609607 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p" event={"ID":"12cd9906-9f9f-42ba-8869-54f39ae29366","Type":"ContainerDied","Data":"07b72c61a7755bb1778bccb2f9c2d62914f43823423f820b4181c1515f07f50b"} Dec 05 23:43:32 crc 
kubenswrapper[4734]: I1205 23:43:32.043677 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p" Dec 05 23:43:32 crc kubenswrapper[4734]: I1205 23:43:32.139609 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/12cd9906-9f9f-42ba-8869-54f39ae29366-ssh-key\") pod \"12cd9906-9f9f-42ba-8869-54f39ae29366\" (UID: \"12cd9906-9f9f-42ba-8869-54f39ae29366\") " Dec 05 23:43:32 crc kubenswrapper[4734]: I1205 23:43:32.139753 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12cd9906-9f9f-42ba-8869-54f39ae29366-repo-setup-combined-ca-bundle\") pod \"12cd9906-9f9f-42ba-8869-54f39ae29366\" (UID: \"12cd9906-9f9f-42ba-8869-54f39ae29366\") " Dec 05 23:43:32 crc kubenswrapper[4734]: I1205 23:43:32.139951 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktplm\" (UniqueName: \"kubernetes.io/projected/12cd9906-9f9f-42ba-8869-54f39ae29366-kube-api-access-ktplm\") pod \"12cd9906-9f9f-42ba-8869-54f39ae29366\" (UID: \"12cd9906-9f9f-42ba-8869-54f39ae29366\") " Dec 05 23:43:32 crc kubenswrapper[4734]: I1205 23:43:32.139985 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12cd9906-9f9f-42ba-8869-54f39ae29366-inventory\") pod \"12cd9906-9f9f-42ba-8869-54f39ae29366\" (UID: \"12cd9906-9f9f-42ba-8869-54f39ae29366\") " Dec 05 23:43:32 crc kubenswrapper[4734]: I1205 23:43:32.149169 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12cd9906-9f9f-42ba-8869-54f39ae29366-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "12cd9906-9f9f-42ba-8869-54f39ae29366" (UID: "12cd9906-9f9f-42ba-8869-54f39ae29366"). 
InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:43:32 crc kubenswrapper[4734]: I1205 23:43:32.149246 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12cd9906-9f9f-42ba-8869-54f39ae29366-kube-api-access-ktplm" (OuterVolumeSpecName: "kube-api-access-ktplm") pod "12cd9906-9f9f-42ba-8869-54f39ae29366" (UID: "12cd9906-9f9f-42ba-8869-54f39ae29366"). InnerVolumeSpecName "kube-api-access-ktplm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:43:32 crc kubenswrapper[4734]: I1205 23:43:32.185963 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12cd9906-9f9f-42ba-8869-54f39ae29366-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "12cd9906-9f9f-42ba-8869-54f39ae29366" (UID: "12cd9906-9f9f-42ba-8869-54f39ae29366"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:43:32 crc kubenswrapper[4734]: I1205 23:43:32.194085 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12cd9906-9f9f-42ba-8869-54f39ae29366-inventory" (OuterVolumeSpecName: "inventory") pod "12cd9906-9f9f-42ba-8869-54f39ae29366" (UID: "12cd9906-9f9f-42ba-8869-54f39ae29366"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:43:32 crc kubenswrapper[4734]: I1205 23:43:32.243133 4734 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/12cd9906-9f9f-42ba-8869-54f39ae29366-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 23:43:32 crc kubenswrapper[4734]: I1205 23:43:32.243169 4734 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12cd9906-9f9f-42ba-8869-54f39ae29366-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:43:32 crc kubenswrapper[4734]: I1205 23:43:32.243184 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktplm\" (UniqueName: \"kubernetes.io/projected/12cd9906-9f9f-42ba-8869-54f39ae29366-kube-api-access-ktplm\") on node \"crc\" DevicePath \"\"" Dec 05 23:43:32 crc kubenswrapper[4734]: I1205 23:43:32.243197 4734 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12cd9906-9f9f-42ba-8869-54f39ae29366-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 23:43:32 crc kubenswrapper[4734]: I1205 23:43:32.648062 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p" event={"ID":"12cd9906-9f9f-42ba-8869-54f39ae29366","Type":"ContainerDied","Data":"29db5ecf5defefdbcd5a9d425753a84af511b96f9bb5a14044059ca4280df18d"} Dec 05 23:43:32 crc kubenswrapper[4734]: I1205 23:43:32.648127 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29db5ecf5defefdbcd5a9d425753a84af511b96f9bb5a14044059ca4280df18d" Dec 05 23:43:32 crc kubenswrapper[4734]: I1205 23:43:32.648213 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p" Dec 05 23:43:32 crc kubenswrapper[4734]: I1205 23:43:32.767187 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-z9lz9"] Dec 05 23:43:32 crc kubenswrapper[4734]: E1205 23:43:32.767793 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12cd9906-9f9f-42ba-8869-54f39ae29366" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 05 23:43:32 crc kubenswrapper[4734]: I1205 23:43:32.767818 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="12cd9906-9f9f-42ba-8869-54f39ae29366" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 05 23:43:32 crc kubenswrapper[4734]: I1205 23:43:32.768048 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="12cd9906-9f9f-42ba-8869-54f39ae29366" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 05 23:43:32 crc kubenswrapper[4734]: I1205 23:43:32.768981 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z9lz9" Dec 05 23:43:32 crc kubenswrapper[4734]: I1205 23:43:32.776204 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gsdqx" Dec 05 23:43:32 crc kubenswrapper[4734]: I1205 23:43:32.776423 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 23:43:32 crc kubenswrapper[4734]: I1205 23:43:32.776602 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 23:43:32 crc kubenswrapper[4734]: I1205 23:43:32.776921 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 23:43:32 crc kubenswrapper[4734]: I1205 23:43:32.782190 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-z9lz9"] Dec 05 23:43:32 crc kubenswrapper[4734]: I1205 23:43:32.856939 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29e8f09f-ca59-420f-ae3c-8bdb696d653a-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-z9lz9\" (UID: \"29e8f09f-ca59-420f-ae3c-8bdb696d653a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z9lz9" Dec 05 23:43:32 crc kubenswrapper[4734]: I1205 23:43:32.857138 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd2fx\" (UniqueName: \"kubernetes.io/projected/29e8f09f-ca59-420f-ae3c-8bdb696d653a-kube-api-access-xd2fx\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-z9lz9\" (UID: \"29e8f09f-ca59-420f-ae3c-8bdb696d653a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z9lz9" Dec 05 23:43:32 crc kubenswrapper[4734]: I1205 23:43:32.857181 4734 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29e8f09f-ca59-420f-ae3c-8bdb696d653a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-z9lz9\" (UID: \"29e8f09f-ca59-420f-ae3c-8bdb696d653a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z9lz9" Dec 05 23:43:32 crc kubenswrapper[4734]: I1205 23:43:32.959272 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd2fx\" (UniqueName: \"kubernetes.io/projected/29e8f09f-ca59-420f-ae3c-8bdb696d653a-kube-api-access-xd2fx\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-z9lz9\" (UID: \"29e8f09f-ca59-420f-ae3c-8bdb696d653a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z9lz9" Dec 05 23:43:32 crc kubenswrapper[4734]: I1205 23:43:32.959348 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29e8f09f-ca59-420f-ae3c-8bdb696d653a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-z9lz9\" (UID: \"29e8f09f-ca59-420f-ae3c-8bdb696d653a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z9lz9" Dec 05 23:43:32 crc kubenswrapper[4734]: I1205 23:43:32.959410 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29e8f09f-ca59-420f-ae3c-8bdb696d653a-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-z9lz9\" (UID: \"29e8f09f-ca59-420f-ae3c-8bdb696d653a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z9lz9" Dec 05 23:43:32 crc kubenswrapper[4734]: I1205 23:43:32.966444 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29e8f09f-ca59-420f-ae3c-8bdb696d653a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-z9lz9\" (UID: \"29e8f09f-ca59-420f-ae3c-8bdb696d653a\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z9lz9" Dec 05 23:43:32 crc kubenswrapper[4734]: I1205 23:43:32.966985 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29e8f09f-ca59-420f-ae3c-8bdb696d653a-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-z9lz9\" (UID: \"29e8f09f-ca59-420f-ae3c-8bdb696d653a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z9lz9" Dec 05 23:43:33 crc kubenswrapper[4734]: I1205 23:43:33.008923 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd2fx\" (UniqueName: \"kubernetes.io/projected/29e8f09f-ca59-420f-ae3c-8bdb696d653a-kube-api-access-xd2fx\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-z9lz9\" (UID: \"29e8f09f-ca59-420f-ae3c-8bdb696d653a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z9lz9" Dec 05 23:43:33 crc kubenswrapper[4734]: I1205 23:43:33.104048 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z9lz9" Dec 05 23:43:34 crc kubenswrapper[4734]: I1205 23:43:34.910582 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-z9lz9"] Dec 05 23:43:35 crc kubenswrapper[4734]: I1205 23:43:35.698015 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z9lz9" event={"ID":"29e8f09f-ca59-420f-ae3c-8bdb696d653a","Type":"ContainerStarted","Data":"ed6ab7fd107123443d3cfe3f965e401617abc35fac99181c1048f8c7a1428be4"} Dec 05 23:43:35 crc kubenswrapper[4734]: I1205 23:43:35.698436 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z9lz9" event={"ID":"29e8f09f-ca59-420f-ae3c-8bdb696d653a","Type":"ContainerStarted","Data":"2861a14d68daab6e76f961ebed8f3d58e9bdeca3739ab8ec615b594c7b40b556"} Dec 05 23:43:38 crc kubenswrapper[4734]: I1205 23:43:38.141956 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 05 23:43:38 crc kubenswrapper[4734]: I1205 23:43:38.172199 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z9lz9" podStartSLOduration=5.711445017 podStartE2EDuration="6.172161919s" podCreationTimestamp="2025-12-05 23:43:32 +0000 UTC" firstStartedPulling="2025-12-05 23:43:34.925206922 +0000 UTC m=+1435.608611198" lastFinishedPulling="2025-12-05 23:43:35.385923824 +0000 UTC m=+1436.069328100" observedRunningTime="2025-12-05 23:43:35.724328314 +0000 UTC m=+1436.407732610" watchObservedRunningTime="2025-12-05 23:43:38.172161919 +0000 UTC m=+1438.855566205" Dec 05 23:43:38 crc kubenswrapper[4734]: I1205 23:43:38.734335 4734 generic.go:334] "Generic (PLEG): container finished" podID="29e8f09f-ca59-420f-ae3c-8bdb696d653a" 
containerID="ed6ab7fd107123443d3cfe3f965e401617abc35fac99181c1048f8c7a1428be4" exitCode=0 Dec 05 23:43:38 crc kubenswrapper[4734]: I1205 23:43:38.734759 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z9lz9" event={"ID":"29e8f09f-ca59-420f-ae3c-8bdb696d653a","Type":"ContainerDied","Data":"ed6ab7fd107123443d3cfe3f965e401617abc35fac99181c1048f8c7a1428be4"} Dec 05 23:43:39 crc kubenswrapper[4734]: I1205 23:43:39.833839 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 05 23:43:40 crc kubenswrapper[4734]: I1205 23:43:40.323182 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z9lz9" Dec 05 23:43:40 crc kubenswrapper[4734]: I1205 23:43:40.333963 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29e8f09f-ca59-420f-ae3c-8bdb696d653a-ssh-key\") pod \"29e8f09f-ca59-420f-ae3c-8bdb696d653a\" (UID: \"29e8f09f-ca59-420f-ae3c-8bdb696d653a\") " Dec 05 23:43:40 crc kubenswrapper[4734]: I1205 23:43:40.334095 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29e8f09f-ca59-420f-ae3c-8bdb696d653a-inventory\") pod \"29e8f09f-ca59-420f-ae3c-8bdb696d653a\" (UID: \"29e8f09f-ca59-420f-ae3c-8bdb696d653a\") " Dec 05 23:43:40 crc kubenswrapper[4734]: I1205 23:43:40.334149 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xd2fx\" (UniqueName: \"kubernetes.io/projected/29e8f09f-ca59-420f-ae3c-8bdb696d653a-kube-api-access-xd2fx\") pod \"29e8f09f-ca59-420f-ae3c-8bdb696d653a\" (UID: \"29e8f09f-ca59-420f-ae3c-8bdb696d653a\") " Dec 05 23:43:40 crc kubenswrapper[4734]: I1205 23:43:40.346023 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/29e8f09f-ca59-420f-ae3c-8bdb696d653a-kube-api-access-xd2fx" (OuterVolumeSpecName: "kube-api-access-xd2fx") pod "29e8f09f-ca59-420f-ae3c-8bdb696d653a" (UID: "29e8f09f-ca59-420f-ae3c-8bdb696d653a"). InnerVolumeSpecName "kube-api-access-xd2fx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:43:40 crc kubenswrapper[4734]: I1205 23:43:40.390088 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e8f09f-ca59-420f-ae3c-8bdb696d653a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "29e8f09f-ca59-420f-ae3c-8bdb696d653a" (UID: "29e8f09f-ca59-420f-ae3c-8bdb696d653a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:43:40 crc kubenswrapper[4734]: I1205 23:43:40.390698 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e8f09f-ca59-420f-ae3c-8bdb696d653a-inventory" (OuterVolumeSpecName: "inventory") pod "29e8f09f-ca59-420f-ae3c-8bdb696d653a" (UID: "29e8f09f-ca59-420f-ae3c-8bdb696d653a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:43:40 crc kubenswrapper[4734]: I1205 23:43:40.437444 4734 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29e8f09f-ca59-420f-ae3c-8bdb696d653a-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 23:43:40 crc kubenswrapper[4734]: I1205 23:43:40.437505 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xd2fx\" (UniqueName: \"kubernetes.io/projected/29e8f09f-ca59-420f-ae3c-8bdb696d653a-kube-api-access-xd2fx\") on node \"crc\" DevicePath \"\"" Dec 05 23:43:40 crc kubenswrapper[4734]: I1205 23:43:40.437557 4734 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29e8f09f-ca59-420f-ae3c-8bdb696d653a-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 23:43:40 crc kubenswrapper[4734]: I1205 23:43:40.764254 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z9lz9" event={"ID":"29e8f09f-ca59-420f-ae3c-8bdb696d653a","Type":"ContainerDied","Data":"2861a14d68daab6e76f961ebed8f3d58e9bdeca3739ab8ec615b594c7b40b556"} Dec 05 23:43:40 crc kubenswrapper[4734]: I1205 23:43:40.764321 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2861a14d68daab6e76f961ebed8f3d58e9bdeca3739ab8ec615b594c7b40b556" Dec 05 23:43:40 crc kubenswrapper[4734]: I1205 23:43:40.764432 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-z9lz9" Dec 05 23:43:40 crc kubenswrapper[4734]: I1205 23:43:40.855244 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw"] Dec 05 23:43:40 crc kubenswrapper[4734]: E1205 23:43:40.855975 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29e8f09f-ca59-420f-ae3c-8bdb696d653a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 05 23:43:40 crc kubenswrapper[4734]: I1205 23:43:40.856022 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="29e8f09f-ca59-420f-ae3c-8bdb696d653a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 05 23:43:40 crc kubenswrapper[4734]: I1205 23:43:40.856334 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="29e8f09f-ca59-420f-ae3c-8bdb696d653a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 05 23:43:40 crc kubenswrapper[4734]: I1205 23:43:40.857720 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw" Dec 05 23:43:40 crc kubenswrapper[4734]: I1205 23:43:40.863733 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 23:43:40 crc kubenswrapper[4734]: I1205 23:43:40.864158 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 23:43:40 crc kubenswrapper[4734]: I1205 23:43:40.864357 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 23:43:40 crc kubenswrapper[4734]: I1205 23:43:40.864666 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gsdqx" Dec 05 23:43:40 crc kubenswrapper[4734]: I1205 23:43:40.872480 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw"] Dec 05 23:43:40 crc kubenswrapper[4734]: I1205 23:43:40.954613 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/faef139d-614e-4c50-a383-8dd231a47b83-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw\" (UID: \"faef139d-614e-4c50-a383-8dd231a47b83\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw" Dec 05 23:43:40 crc kubenswrapper[4734]: I1205 23:43:40.954699 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p99vr\" (UniqueName: \"kubernetes.io/projected/faef139d-614e-4c50-a383-8dd231a47b83-kube-api-access-p99vr\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw\" (UID: \"faef139d-614e-4c50-a383-8dd231a47b83\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw" Dec 05 23:43:40 crc kubenswrapper[4734]: I1205 23:43:40.954745 4734 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faef139d-614e-4c50-a383-8dd231a47b83-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw\" (UID: \"faef139d-614e-4c50-a383-8dd231a47b83\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw" Dec 05 23:43:40 crc kubenswrapper[4734]: I1205 23:43:40.955022 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/faef139d-614e-4c50-a383-8dd231a47b83-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw\" (UID: \"faef139d-614e-4c50-a383-8dd231a47b83\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw" Dec 05 23:43:41 crc kubenswrapper[4734]: I1205 23:43:41.056816 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/faef139d-614e-4c50-a383-8dd231a47b83-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw\" (UID: \"faef139d-614e-4c50-a383-8dd231a47b83\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw" Dec 05 23:43:41 crc kubenswrapper[4734]: I1205 23:43:41.057375 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p99vr\" (UniqueName: \"kubernetes.io/projected/faef139d-614e-4c50-a383-8dd231a47b83-kube-api-access-p99vr\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw\" (UID: \"faef139d-614e-4c50-a383-8dd231a47b83\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw" Dec 05 23:43:41 crc kubenswrapper[4734]: I1205 23:43:41.057433 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/faef139d-614e-4c50-a383-8dd231a47b83-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw\" (UID: \"faef139d-614e-4c50-a383-8dd231a47b83\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw" Dec 05 23:43:41 crc kubenswrapper[4734]: I1205 23:43:41.057623 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/faef139d-614e-4c50-a383-8dd231a47b83-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw\" (UID: \"faef139d-614e-4c50-a383-8dd231a47b83\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw" Dec 05 23:43:41 crc kubenswrapper[4734]: I1205 23:43:41.063864 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/faef139d-614e-4c50-a383-8dd231a47b83-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw\" (UID: \"faef139d-614e-4c50-a383-8dd231a47b83\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw" Dec 05 23:43:41 crc kubenswrapper[4734]: I1205 23:43:41.065983 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faef139d-614e-4c50-a383-8dd231a47b83-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw\" (UID: \"faef139d-614e-4c50-a383-8dd231a47b83\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw" Dec 05 23:43:41 crc kubenswrapper[4734]: I1205 23:43:41.067185 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/faef139d-614e-4c50-a383-8dd231a47b83-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw\" (UID: \"faef139d-614e-4c50-a383-8dd231a47b83\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw" Dec 05 23:43:41 crc 
kubenswrapper[4734]: I1205 23:43:41.077599 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p99vr\" (UniqueName: \"kubernetes.io/projected/faef139d-614e-4c50-a383-8dd231a47b83-kube-api-access-p99vr\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw\" (UID: \"faef139d-614e-4c50-a383-8dd231a47b83\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw" Dec 05 23:43:41 crc kubenswrapper[4734]: I1205 23:43:41.226028 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw" Dec 05 23:43:41 crc kubenswrapper[4734]: I1205 23:43:41.827169 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw"] Dec 05 23:43:41 crc kubenswrapper[4734]: W1205 23:43:41.831026 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfaef139d_614e_4c50_a383_8dd231a47b83.slice/crio-9b54b978335fc419391228ed882d28af6ba7c84daa620c6a83a7efaf76ebcfe6 WatchSource:0}: Error finding container 9b54b978335fc419391228ed882d28af6ba7c84daa620c6a83a7efaf76ebcfe6: Status 404 returned error can't find the container with id 9b54b978335fc419391228ed882d28af6ba7c84daa620c6a83a7efaf76ebcfe6 Dec 05 23:43:42 crc kubenswrapper[4734]: I1205 23:43:42.795698 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw" event={"ID":"faef139d-614e-4c50-a383-8dd231a47b83","Type":"ContainerStarted","Data":"0e6d681385369a0b51703c270598fb237e8d0bd9d8058d34e8a4cc738ceb7c65"} Dec 05 23:43:42 crc kubenswrapper[4734]: I1205 23:43:42.796232 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw" 
event={"ID":"faef139d-614e-4c50-a383-8dd231a47b83","Type":"ContainerStarted","Data":"9b54b978335fc419391228ed882d28af6ba7c84daa620c6a83a7efaf76ebcfe6"} Dec 05 23:43:42 crc kubenswrapper[4734]: I1205 23:43:42.821037 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw" podStartSLOduration=2.311545416 podStartE2EDuration="2.821013462s" podCreationTimestamp="2025-12-05 23:43:40 +0000 UTC" firstStartedPulling="2025-12-05 23:43:41.834857816 +0000 UTC m=+1442.518262092" lastFinishedPulling="2025-12-05 23:43:42.344325852 +0000 UTC m=+1443.027730138" observedRunningTime="2025-12-05 23:43:42.811808518 +0000 UTC m=+1443.495212794" watchObservedRunningTime="2025-12-05 23:43:42.821013462 +0000 UTC m=+1443.504417738" Dec 05 23:43:49 crc kubenswrapper[4734]: I1205 23:43:49.789548 4734 scope.go:117] "RemoveContainer" containerID="af8dc54646134df08947f581fbe33154b97a057f76899065313e09bd16c36ef1" Dec 05 23:44:45 crc kubenswrapper[4734]: I1205 23:44:45.462039 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xs8dw"] Dec 05 23:44:45 crc kubenswrapper[4734]: I1205 23:44:45.465737 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xs8dw" Dec 05 23:44:45 crc kubenswrapper[4734]: I1205 23:44:45.482994 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xs8dw"] Dec 05 23:44:45 crc kubenswrapper[4734]: I1205 23:44:45.499348 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dfc201c-15e8-4615-939c-12031587e0be-catalog-content\") pod \"redhat-marketplace-xs8dw\" (UID: \"6dfc201c-15e8-4615-939c-12031587e0be\") " pod="openshift-marketplace/redhat-marketplace-xs8dw" Dec 05 23:44:45 crc kubenswrapper[4734]: I1205 23:44:45.499432 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm9bb\" (UniqueName: \"kubernetes.io/projected/6dfc201c-15e8-4615-939c-12031587e0be-kube-api-access-tm9bb\") pod \"redhat-marketplace-xs8dw\" (UID: \"6dfc201c-15e8-4615-939c-12031587e0be\") " pod="openshift-marketplace/redhat-marketplace-xs8dw" Dec 05 23:44:45 crc kubenswrapper[4734]: I1205 23:44:45.499509 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dfc201c-15e8-4615-939c-12031587e0be-utilities\") pod \"redhat-marketplace-xs8dw\" (UID: \"6dfc201c-15e8-4615-939c-12031587e0be\") " pod="openshift-marketplace/redhat-marketplace-xs8dw" Dec 05 23:44:45 crc kubenswrapper[4734]: I1205 23:44:45.602151 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dfc201c-15e8-4615-939c-12031587e0be-utilities\") pod \"redhat-marketplace-xs8dw\" (UID: \"6dfc201c-15e8-4615-939c-12031587e0be\") " pod="openshift-marketplace/redhat-marketplace-xs8dw" Dec 05 23:44:45 crc kubenswrapper[4734]: I1205 23:44:45.602324 4734 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dfc201c-15e8-4615-939c-12031587e0be-catalog-content\") pod \"redhat-marketplace-xs8dw\" (UID: \"6dfc201c-15e8-4615-939c-12031587e0be\") " pod="openshift-marketplace/redhat-marketplace-xs8dw" Dec 05 23:44:45 crc kubenswrapper[4734]: I1205 23:44:45.602366 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm9bb\" (UniqueName: \"kubernetes.io/projected/6dfc201c-15e8-4615-939c-12031587e0be-kube-api-access-tm9bb\") pod \"redhat-marketplace-xs8dw\" (UID: \"6dfc201c-15e8-4615-939c-12031587e0be\") " pod="openshift-marketplace/redhat-marketplace-xs8dw" Dec 05 23:44:45 crc kubenswrapper[4734]: I1205 23:44:45.603037 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dfc201c-15e8-4615-939c-12031587e0be-utilities\") pod \"redhat-marketplace-xs8dw\" (UID: \"6dfc201c-15e8-4615-939c-12031587e0be\") " pod="openshift-marketplace/redhat-marketplace-xs8dw" Dec 05 23:44:45 crc kubenswrapper[4734]: I1205 23:44:45.603169 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dfc201c-15e8-4615-939c-12031587e0be-catalog-content\") pod \"redhat-marketplace-xs8dw\" (UID: \"6dfc201c-15e8-4615-939c-12031587e0be\") " pod="openshift-marketplace/redhat-marketplace-xs8dw" Dec 05 23:44:45 crc kubenswrapper[4734]: I1205 23:44:45.645423 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm9bb\" (UniqueName: \"kubernetes.io/projected/6dfc201c-15e8-4615-939c-12031587e0be-kube-api-access-tm9bb\") pod \"redhat-marketplace-xs8dw\" (UID: \"6dfc201c-15e8-4615-939c-12031587e0be\") " pod="openshift-marketplace/redhat-marketplace-xs8dw" Dec 05 23:44:45 crc kubenswrapper[4734]: I1205 23:44:45.798096 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xs8dw" Dec 05 23:44:46 crc kubenswrapper[4734]: I1205 23:44:46.362284 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xs8dw"] Dec 05 23:44:46 crc kubenswrapper[4734]: I1205 23:44:46.515984 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xs8dw" event={"ID":"6dfc201c-15e8-4615-939c-12031587e0be","Type":"ContainerStarted","Data":"f34844360c8d299860c66567a8c63667a1c0ba430bbf2f8ab911a93558f01e96"} Dec 05 23:44:47 crc kubenswrapper[4734]: I1205 23:44:47.530368 4734 generic.go:334] "Generic (PLEG): container finished" podID="6dfc201c-15e8-4615-939c-12031587e0be" containerID="26bc956bc425ea470284ea8cbce922dedd4d6e032de03eeb6f3cde52766c2858" exitCode=0 Dec 05 23:44:47 crc kubenswrapper[4734]: I1205 23:44:47.530590 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xs8dw" event={"ID":"6dfc201c-15e8-4615-939c-12031587e0be","Type":"ContainerDied","Data":"26bc956bc425ea470284ea8cbce922dedd4d6e032de03eeb6f3cde52766c2858"} Dec 05 23:44:48 crc kubenswrapper[4734]: I1205 23:44:48.542826 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xs8dw" event={"ID":"6dfc201c-15e8-4615-939c-12031587e0be","Type":"ContainerStarted","Data":"eff12ddcba8215acd448e37e5f1d10db440d601db1c67ae2916bb88d866f373d"} Dec 05 23:44:49 crc kubenswrapper[4734]: I1205 23:44:49.555184 4734 generic.go:334] "Generic (PLEG): container finished" podID="6dfc201c-15e8-4615-939c-12031587e0be" containerID="eff12ddcba8215acd448e37e5f1d10db440d601db1c67ae2916bb88d866f373d" exitCode=0 Dec 05 23:44:49 crc kubenswrapper[4734]: I1205 23:44:49.555312 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xs8dw" 
event={"ID":"6dfc201c-15e8-4615-939c-12031587e0be","Type":"ContainerDied","Data":"eff12ddcba8215acd448e37e5f1d10db440d601db1c67ae2916bb88d866f373d"} Dec 05 23:44:49 crc kubenswrapper[4734]: I1205 23:44:49.893653 4734 scope.go:117] "RemoveContainer" containerID="c84a2d92f1ff7d0be99abd71752073d96b69b8ea001fe342c500401c23b2e406" Dec 05 23:44:50 crc kubenswrapper[4734]: I1205 23:44:50.568810 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xs8dw" event={"ID":"6dfc201c-15e8-4615-939c-12031587e0be","Type":"ContainerStarted","Data":"5664c3016463a68af4a44754ee7ad0227fe98d04208d9e1881dcc987f9832ff7"} Dec 05 23:44:50 crc kubenswrapper[4734]: I1205 23:44:50.602947 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xs8dw" podStartSLOduration=3.171926351 podStartE2EDuration="5.602913492s" podCreationTimestamp="2025-12-05 23:44:45 +0000 UTC" firstStartedPulling="2025-12-05 23:44:47.53464851 +0000 UTC m=+1508.218052786" lastFinishedPulling="2025-12-05 23:44:49.965635641 +0000 UTC m=+1510.649039927" observedRunningTime="2025-12-05 23:44:50.59009759 +0000 UTC m=+1511.273501866" watchObservedRunningTime="2025-12-05 23:44:50.602913492 +0000 UTC m=+1511.286317768" Dec 05 23:44:55 crc kubenswrapper[4734]: I1205 23:44:55.798766 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xs8dw" Dec 05 23:44:55 crc kubenswrapper[4734]: I1205 23:44:55.799736 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xs8dw" Dec 05 23:44:55 crc kubenswrapper[4734]: I1205 23:44:55.865847 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xs8dw" Dec 05 23:44:56 crc kubenswrapper[4734]: I1205 23:44:56.684696 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-xs8dw" Dec 05 23:44:56 crc kubenswrapper[4734]: I1205 23:44:56.754592 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xs8dw"] Dec 05 23:44:58 crc kubenswrapper[4734]: I1205 23:44:58.653707 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xs8dw" podUID="6dfc201c-15e8-4615-939c-12031587e0be" containerName="registry-server" containerID="cri-o://5664c3016463a68af4a44754ee7ad0227fe98d04208d9e1881dcc987f9832ff7" gracePeriod=2 Dec 05 23:44:59 crc kubenswrapper[4734]: I1205 23:44:59.155152 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xs8dw" Dec 05 23:44:59 crc kubenswrapper[4734]: I1205 23:44:59.225576 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dfc201c-15e8-4615-939c-12031587e0be-catalog-content\") pod \"6dfc201c-15e8-4615-939c-12031587e0be\" (UID: \"6dfc201c-15e8-4615-939c-12031587e0be\") " Dec 05 23:44:59 crc kubenswrapper[4734]: I1205 23:44:59.226396 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tm9bb\" (UniqueName: \"kubernetes.io/projected/6dfc201c-15e8-4615-939c-12031587e0be-kube-api-access-tm9bb\") pod \"6dfc201c-15e8-4615-939c-12031587e0be\" (UID: \"6dfc201c-15e8-4615-939c-12031587e0be\") " Dec 05 23:44:59 crc kubenswrapper[4734]: I1205 23:44:59.226647 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dfc201c-15e8-4615-939c-12031587e0be-utilities\") pod \"6dfc201c-15e8-4615-939c-12031587e0be\" (UID: \"6dfc201c-15e8-4615-939c-12031587e0be\") " Dec 05 23:44:59 crc kubenswrapper[4734]: I1205 23:44:59.228178 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/6dfc201c-15e8-4615-939c-12031587e0be-utilities" (OuterVolumeSpecName: "utilities") pod "6dfc201c-15e8-4615-939c-12031587e0be" (UID: "6dfc201c-15e8-4615-939c-12031587e0be"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:44:59 crc kubenswrapper[4734]: I1205 23:44:59.246604 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dfc201c-15e8-4615-939c-12031587e0be-kube-api-access-tm9bb" (OuterVolumeSpecName: "kube-api-access-tm9bb") pod "6dfc201c-15e8-4615-939c-12031587e0be" (UID: "6dfc201c-15e8-4615-939c-12031587e0be"). InnerVolumeSpecName "kube-api-access-tm9bb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:44:59 crc kubenswrapper[4734]: I1205 23:44:59.257989 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dfc201c-15e8-4615-939c-12031587e0be-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6dfc201c-15e8-4615-939c-12031587e0be" (UID: "6dfc201c-15e8-4615-939c-12031587e0be"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:44:59 crc kubenswrapper[4734]: I1205 23:44:59.331608 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dfc201c-15e8-4615-939c-12031587e0be-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:44:59 crc kubenswrapper[4734]: I1205 23:44:59.331640 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dfc201c-15e8-4615-939c-12031587e0be-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:44:59 crc kubenswrapper[4734]: I1205 23:44:59.331653 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tm9bb\" (UniqueName: \"kubernetes.io/projected/6dfc201c-15e8-4615-939c-12031587e0be-kube-api-access-tm9bb\") on node \"crc\" DevicePath \"\"" Dec 05 23:44:59 crc kubenswrapper[4734]: I1205 23:44:59.667750 4734 generic.go:334] "Generic (PLEG): container finished" podID="6dfc201c-15e8-4615-939c-12031587e0be" containerID="5664c3016463a68af4a44754ee7ad0227fe98d04208d9e1881dcc987f9832ff7" exitCode=0 Dec 05 23:44:59 crc kubenswrapper[4734]: I1205 23:44:59.667811 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xs8dw" Dec 05 23:44:59 crc kubenswrapper[4734]: I1205 23:44:59.667808 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xs8dw" event={"ID":"6dfc201c-15e8-4615-939c-12031587e0be","Type":"ContainerDied","Data":"5664c3016463a68af4a44754ee7ad0227fe98d04208d9e1881dcc987f9832ff7"} Dec 05 23:44:59 crc kubenswrapper[4734]: I1205 23:44:59.667945 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xs8dw" event={"ID":"6dfc201c-15e8-4615-939c-12031587e0be","Type":"ContainerDied","Data":"f34844360c8d299860c66567a8c63667a1c0ba430bbf2f8ab911a93558f01e96"} Dec 05 23:44:59 crc kubenswrapper[4734]: I1205 23:44:59.667967 4734 scope.go:117] "RemoveContainer" containerID="5664c3016463a68af4a44754ee7ad0227fe98d04208d9e1881dcc987f9832ff7" Dec 05 23:44:59 crc kubenswrapper[4734]: I1205 23:44:59.702972 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xs8dw"] Dec 05 23:44:59 crc kubenswrapper[4734]: I1205 23:44:59.708919 4734 scope.go:117] "RemoveContainer" containerID="eff12ddcba8215acd448e37e5f1d10db440d601db1c67ae2916bb88d866f373d" Dec 05 23:44:59 crc kubenswrapper[4734]: I1205 23:44:59.740934 4734 scope.go:117] "RemoveContainer" containerID="26bc956bc425ea470284ea8cbce922dedd4d6e032de03eeb6f3cde52766c2858" Dec 05 23:44:59 crc kubenswrapper[4734]: I1205 23:44:59.742749 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xs8dw"] Dec 05 23:44:59 crc kubenswrapper[4734]: I1205 23:44:59.787506 4734 scope.go:117] "RemoveContainer" containerID="5664c3016463a68af4a44754ee7ad0227fe98d04208d9e1881dcc987f9832ff7" Dec 05 23:44:59 crc kubenswrapper[4734]: E1205 23:44:59.788037 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5664c3016463a68af4a44754ee7ad0227fe98d04208d9e1881dcc987f9832ff7\": container with ID starting with 5664c3016463a68af4a44754ee7ad0227fe98d04208d9e1881dcc987f9832ff7 not found: ID does not exist" containerID="5664c3016463a68af4a44754ee7ad0227fe98d04208d9e1881dcc987f9832ff7" Dec 05 23:44:59 crc kubenswrapper[4734]: I1205 23:44:59.788086 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5664c3016463a68af4a44754ee7ad0227fe98d04208d9e1881dcc987f9832ff7"} err="failed to get container status \"5664c3016463a68af4a44754ee7ad0227fe98d04208d9e1881dcc987f9832ff7\": rpc error: code = NotFound desc = could not find container \"5664c3016463a68af4a44754ee7ad0227fe98d04208d9e1881dcc987f9832ff7\": container with ID starting with 5664c3016463a68af4a44754ee7ad0227fe98d04208d9e1881dcc987f9832ff7 not found: ID does not exist" Dec 05 23:44:59 crc kubenswrapper[4734]: I1205 23:44:59.788120 4734 scope.go:117] "RemoveContainer" containerID="eff12ddcba8215acd448e37e5f1d10db440d601db1c67ae2916bb88d866f373d" Dec 05 23:44:59 crc kubenswrapper[4734]: E1205 23:44:59.788721 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eff12ddcba8215acd448e37e5f1d10db440d601db1c67ae2916bb88d866f373d\": container with ID starting with eff12ddcba8215acd448e37e5f1d10db440d601db1c67ae2916bb88d866f373d not found: ID does not exist" containerID="eff12ddcba8215acd448e37e5f1d10db440d601db1c67ae2916bb88d866f373d" Dec 05 23:44:59 crc kubenswrapper[4734]: I1205 23:44:59.788799 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eff12ddcba8215acd448e37e5f1d10db440d601db1c67ae2916bb88d866f373d"} err="failed to get container status \"eff12ddcba8215acd448e37e5f1d10db440d601db1c67ae2916bb88d866f373d\": rpc error: code = NotFound desc = could not find container \"eff12ddcba8215acd448e37e5f1d10db440d601db1c67ae2916bb88d866f373d\": container with ID 
starting with eff12ddcba8215acd448e37e5f1d10db440d601db1c67ae2916bb88d866f373d not found: ID does not exist" Dec 05 23:44:59 crc kubenswrapper[4734]: I1205 23:44:59.788846 4734 scope.go:117] "RemoveContainer" containerID="26bc956bc425ea470284ea8cbce922dedd4d6e032de03eeb6f3cde52766c2858" Dec 05 23:44:59 crc kubenswrapper[4734]: E1205 23:44:59.789257 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26bc956bc425ea470284ea8cbce922dedd4d6e032de03eeb6f3cde52766c2858\": container with ID starting with 26bc956bc425ea470284ea8cbce922dedd4d6e032de03eeb6f3cde52766c2858 not found: ID does not exist" containerID="26bc956bc425ea470284ea8cbce922dedd4d6e032de03eeb6f3cde52766c2858" Dec 05 23:44:59 crc kubenswrapper[4734]: I1205 23:44:59.789296 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26bc956bc425ea470284ea8cbce922dedd4d6e032de03eeb6f3cde52766c2858"} err="failed to get container status \"26bc956bc425ea470284ea8cbce922dedd4d6e032de03eeb6f3cde52766c2858\": rpc error: code = NotFound desc = could not find container \"26bc956bc425ea470284ea8cbce922dedd4d6e032de03eeb6f3cde52766c2858\": container with ID starting with 26bc956bc425ea470284ea8cbce922dedd4d6e032de03eeb6f3cde52766c2858 not found: ID does not exist" Dec 05 23:45:00 crc kubenswrapper[4734]: I1205 23:45:00.156714 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416305-th7b6"] Dec 05 23:45:00 crc kubenswrapper[4734]: E1205 23:45:00.157245 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dfc201c-15e8-4615-939c-12031587e0be" containerName="extract-utilities" Dec 05 23:45:00 crc kubenswrapper[4734]: I1205 23:45:00.157270 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dfc201c-15e8-4615-939c-12031587e0be" containerName="extract-utilities" Dec 05 23:45:00 crc kubenswrapper[4734]: E1205 
23:45:00.157307 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dfc201c-15e8-4615-939c-12031587e0be" containerName="extract-content" Dec 05 23:45:00 crc kubenswrapper[4734]: I1205 23:45:00.157314 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dfc201c-15e8-4615-939c-12031587e0be" containerName="extract-content" Dec 05 23:45:00 crc kubenswrapper[4734]: E1205 23:45:00.157335 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dfc201c-15e8-4615-939c-12031587e0be" containerName="registry-server" Dec 05 23:45:00 crc kubenswrapper[4734]: I1205 23:45:00.157343 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dfc201c-15e8-4615-939c-12031587e0be" containerName="registry-server" Dec 05 23:45:00 crc kubenswrapper[4734]: I1205 23:45:00.157581 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dfc201c-15e8-4615-939c-12031587e0be" containerName="registry-server" Dec 05 23:45:00 crc kubenswrapper[4734]: I1205 23:45:00.158495 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416305-th7b6" Dec 05 23:45:00 crc kubenswrapper[4734]: I1205 23:45:00.164511 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 23:45:00 crc kubenswrapper[4734]: I1205 23:45:00.171132 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 23:45:00 crc kubenswrapper[4734]: I1205 23:45:00.175411 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416305-th7b6"] Dec 05 23:45:00 crc kubenswrapper[4734]: I1205 23:45:00.249483 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4980d445-086f-4a87-9cfa-b5b4e6196a09-secret-volume\") pod \"collect-profiles-29416305-th7b6\" (UID: \"4980d445-086f-4a87-9cfa-b5b4e6196a09\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416305-th7b6" Dec 05 23:45:00 crc kubenswrapper[4734]: I1205 23:45:00.249578 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5nkw\" (UniqueName: \"kubernetes.io/projected/4980d445-086f-4a87-9cfa-b5b4e6196a09-kube-api-access-j5nkw\") pod \"collect-profiles-29416305-th7b6\" (UID: \"4980d445-086f-4a87-9cfa-b5b4e6196a09\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416305-th7b6" Dec 05 23:45:00 crc kubenswrapper[4734]: I1205 23:45:00.250038 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4980d445-086f-4a87-9cfa-b5b4e6196a09-config-volume\") pod \"collect-profiles-29416305-th7b6\" (UID: \"4980d445-086f-4a87-9cfa-b5b4e6196a09\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29416305-th7b6" Dec 05 23:45:00 crc kubenswrapper[4734]: I1205 23:45:00.352887 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4980d445-086f-4a87-9cfa-b5b4e6196a09-secret-volume\") pod \"collect-profiles-29416305-th7b6\" (UID: \"4980d445-086f-4a87-9cfa-b5b4e6196a09\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416305-th7b6" Dec 05 23:45:00 crc kubenswrapper[4734]: I1205 23:45:00.352978 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5nkw\" (UniqueName: \"kubernetes.io/projected/4980d445-086f-4a87-9cfa-b5b4e6196a09-kube-api-access-j5nkw\") pod \"collect-profiles-29416305-th7b6\" (UID: \"4980d445-086f-4a87-9cfa-b5b4e6196a09\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416305-th7b6" Dec 05 23:45:00 crc kubenswrapper[4734]: I1205 23:45:00.353190 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4980d445-086f-4a87-9cfa-b5b4e6196a09-config-volume\") pod \"collect-profiles-29416305-th7b6\" (UID: \"4980d445-086f-4a87-9cfa-b5b4e6196a09\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416305-th7b6" Dec 05 23:45:00 crc kubenswrapper[4734]: I1205 23:45:00.354501 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4980d445-086f-4a87-9cfa-b5b4e6196a09-config-volume\") pod \"collect-profiles-29416305-th7b6\" (UID: \"4980d445-086f-4a87-9cfa-b5b4e6196a09\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416305-th7b6" Dec 05 23:45:00 crc kubenswrapper[4734]: I1205 23:45:00.360563 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/4980d445-086f-4a87-9cfa-b5b4e6196a09-secret-volume\") pod \"collect-profiles-29416305-th7b6\" (UID: \"4980d445-086f-4a87-9cfa-b5b4e6196a09\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416305-th7b6" Dec 05 23:45:00 crc kubenswrapper[4734]: I1205 23:45:00.375462 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5nkw\" (UniqueName: \"kubernetes.io/projected/4980d445-086f-4a87-9cfa-b5b4e6196a09-kube-api-access-j5nkw\") pod \"collect-profiles-29416305-th7b6\" (UID: \"4980d445-086f-4a87-9cfa-b5b4e6196a09\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416305-th7b6" Dec 05 23:45:00 crc kubenswrapper[4734]: I1205 23:45:00.480308 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416305-th7b6" Dec 05 23:45:01 crc kubenswrapper[4734]: I1205 23:45:01.016545 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416305-th7b6"] Dec 05 23:45:01 crc kubenswrapper[4734]: I1205 23:45:01.630286 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dfc201c-15e8-4615-939c-12031587e0be" path="/var/lib/kubelet/pods/6dfc201c-15e8-4615-939c-12031587e0be/volumes" Dec 05 23:45:01 crc kubenswrapper[4734]: I1205 23:45:01.695488 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416305-th7b6" event={"ID":"4980d445-086f-4a87-9cfa-b5b4e6196a09","Type":"ContainerDied","Data":"239f2595428af802099ded42c3311bee633818cade65a8719e9bbbeed3d0f823"} Dec 05 23:45:01 crc kubenswrapper[4734]: I1205 23:45:01.695273 4734 generic.go:334] "Generic (PLEG): container finished" podID="4980d445-086f-4a87-9cfa-b5b4e6196a09" containerID="239f2595428af802099ded42c3311bee633818cade65a8719e9bbbeed3d0f823" exitCode=0 Dec 05 23:45:01 crc kubenswrapper[4734]: I1205 23:45:01.696216 4734 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416305-th7b6" event={"ID":"4980d445-086f-4a87-9cfa-b5b4e6196a09","Type":"ContainerStarted","Data":"308288dd7c8e00623ac8c7404af6424b20d78372c08ab4f40793c87ac890d606"} Dec 05 23:45:03 crc kubenswrapper[4734]: I1205 23:45:03.112348 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416305-th7b6" Dec 05 23:45:03 crc kubenswrapper[4734]: I1205 23:45:03.227630 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5nkw\" (UniqueName: \"kubernetes.io/projected/4980d445-086f-4a87-9cfa-b5b4e6196a09-kube-api-access-j5nkw\") pod \"4980d445-086f-4a87-9cfa-b5b4e6196a09\" (UID: \"4980d445-086f-4a87-9cfa-b5b4e6196a09\") " Dec 05 23:45:03 crc kubenswrapper[4734]: I1205 23:45:03.227974 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4980d445-086f-4a87-9cfa-b5b4e6196a09-config-volume\") pod \"4980d445-086f-4a87-9cfa-b5b4e6196a09\" (UID: \"4980d445-086f-4a87-9cfa-b5b4e6196a09\") " Dec 05 23:45:03 crc kubenswrapper[4734]: I1205 23:45:03.228045 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4980d445-086f-4a87-9cfa-b5b4e6196a09-secret-volume\") pod \"4980d445-086f-4a87-9cfa-b5b4e6196a09\" (UID: \"4980d445-086f-4a87-9cfa-b5b4e6196a09\") " Dec 05 23:45:03 crc kubenswrapper[4734]: I1205 23:45:03.229192 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4980d445-086f-4a87-9cfa-b5b4e6196a09-config-volume" (OuterVolumeSpecName: "config-volume") pod "4980d445-086f-4a87-9cfa-b5b4e6196a09" (UID: "4980d445-086f-4a87-9cfa-b5b4e6196a09"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:45:03 crc kubenswrapper[4734]: I1205 23:45:03.235993 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4980d445-086f-4a87-9cfa-b5b4e6196a09-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4980d445-086f-4a87-9cfa-b5b4e6196a09" (UID: "4980d445-086f-4a87-9cfa-b5b4e6196a09"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:45:03 crc kubenswrapper[4734]: I1205 23:45:03.236447 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4980d445-086f-4a87-9cfa-b5b4e6196a09-kube-api-access-j5nkw" (OuterVolumeSpecName: "kube-api-access-j5nkw") pod "4980d445-086f-4a87-9cfa-b5b4e6196a09" (UID: "4980d445-086f-4a87-9cfa-b5b4e6196a09"). InnerVolumeSpecName "kube-api-access-j5nkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:45:03 crc kubenswrapper[4734]: I1205 23:45:03.330670 4734 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4980d445-086f-4a87-9cfa-b5b4e6196a09-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 23:45:03 crc kubenswrapper[4734]: I1205 23:45:03.330709 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5nkw\" (UniqueName: \"kubernetes.io/projected/4980d445-086f-4a87-9cfa-b5b4e6196a09-kube-api-access-j5nkw\") on node \"crc\" DevicePath \"\"" Dec 05 23:45:03 crc kubenswrapper[4734]: I1205 23:45:03.330722 4734 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4980d445-086f-4a87-9cfa-b5b4e6196a09-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 23:45:03 crc kubenswrapper[4734]: I1205 23:45:03.719738 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416305-th7b6" 
event={"ID":"4980d445-086f-4a87-9cfa-b5b4e6196a09","Type":"ContainerDied","Data":"308288dd7c8e00623ac8c7404af6424b20d78372c08ab4f40793c87ac890d606"} Dec 05 23:45:03 crc kubenswrapper[4734]: I1205 23:45:03.720492 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="308288dd7c8e00623ac8c7404af6424b20d78372c08ab4f40793c87ac890d606" Dec 05 23:45:03 crc kubenswrapper[4734]: I1205 23:45:03.719997 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416305-th7b6" Dec 05 23:45:11 crc kubenswrapper[4734]: I1205 23:45:11.356699 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jbcx6"] Dec 05 23:45:11 crc kubenswrapper[4734]: E1205 23:45:11.358999 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4980d445-086f-4a87-9cfa-b5b4e6196a09" containerName="collect-profiles" Dec 05 23:45:11 crc kubenswrapper[4734]: I1205 23:45:11.359111 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="4980d445-086f-4a87-9cfa-b5b4e6196a09" containerName="collect-profiles" Dec 05 23:45:11 crc kubenswrapper[4734]: I1205 23:45:11.359429 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="4980d445-086f-4a87-9cfa-b5b4e6196a09" containerName="collect-profiles" Dec 05 23:45:11 crc kubenswrapper[4734]: I1205 23:45:11.361382 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jbcx6" Dec 05 23:45:11 crc kubenswrapper[4734]: I1205 23:45:11.374048 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jbcx6"] Dec 05 23:45:11 crc kubenswrapper[4734]: I1205 23:45:11.426572 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06fe0661-7bf7-4f08-aa80-14fb6b7ee841-utilities\") pod \"community-operators-jbcx6\" (UID: \"06fe0661-7bf7-4f08-aa80-14fb6b7ee841\") " pod="openshift-marketplace/community-operators-jbcx6" Dec 05 23:45:11 crc kubenswrapper[4734]: I1205 23:45:11.426789 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06fe0661-7bf7-4f08-aa80-14fb6b7ee841-catalog-content\") pod \"community-operators-jbcx6\" (UID: \"06fe0661-7bf7-4f08-aa80-14fb6b7ee841\") " pod="openshift-marketplace/community-operators-jbcx6" Dec 05 23:45:11 crc kubenswrapper[4734]: I1205 23:45:11.426841 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqvg5\" (UniqueName: \"kubernetes.io/projected/06fe0661-7bf7-4f08-aa80-14fb6b7ee841-kube-api-access-zqvg5\") pod \"community-operators-jbcx6\" (UID: \"06fe0661-7bf7-4f08-aa80-14fb6b7ee841\") " pod="openshift-marketplace/community-operators-jbcx6" Dec 05 23:45:11 crc kubenswrapper[4734]: I1205 23:45:11.529230 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06fe0661-7bf7-4f08-aa80-14fb6b7ee841-catalog-content\") pod \"community-operators-jbcx6\" (UID: \"06fe0661-7bf7-4f08-aa80-14fb6b7ee841\") " pod="openshift-marketplace/community-operators-jbcx6" Dec 05 23:45:11 crc kubenswrapper[4734]: I1205 23:45:11.529585 4734 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zqvg5\" (UniqueName: \"kubernetes.io/projected/06fe0661-7bf7-4f08-aa80-14fb6b7ee841-kube-api-access-zqvg5\") pod \"community-operators-jbcx6\" (UID: \"06fe0661-7bf7-4f08-aa80-14fb6b7ee841\") " pod="openshift-marketplace/community-operators-jbcx6" Dec 05 23:45:11 crc kubenswrapper[4734]: I1205 23:45:11.529788 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06fe0661-7bf7-4f08-aa80-14fb6b7ee841-utilities\") pod \"community-operators-jbcx6\" (UID: \"06fe0661-7bf7-4f08-aa80-14fb6b7ee841\") " pod="openshift-marketplace/community-operators-jbcx6" Dec 05 23:45:11 crc kubenswrapper[4734]: I1205 23:45:11.530084 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06fe0661-7bf7-4f08-aa80-14fb6b7ee841-catalog-content\") pod \"community-operators-jbcx6\" (UID: \"06fe0661-7bf7-4f08-aa80-14fb6b7ee841\") " pod="openshift-marketplace/community-operators-jbcx6" Dec 05 23:45:11 crc kubenswrapper[4734]: I1205 23:45:11.530142 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06fe0661-7bf7-4f08-aa80-14fb6b7ee841-utilities\") pod \"community-operators-jbcx6\" (UID: \"06fe0661-7bf7-4f08-aa80-14fb6b7ee841\") " pod="openshift-marketplace/community-operators-jbcx6" Dec 05 23:45:11 crc kubenswrapper[4734]: I1205 23:45:11.562144 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqvg5\" (UniqueName: \"kubernetes.io/projected/06fe0661-7bf7-4f08-aa80-14fb6b7ee841-kube-api-access-zqvg5\") pod \"community-operators-jbcx6\" (UID: \"06fe0661-7bf7-4f08-aa80-14fb6b7ee841\") " pod="openshift-marketplace/community-operators-jbcx6" Dec 05 23:45:11 crc kubenswrapper[4734]: I1205 23:45:11.690464 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jbcx6" Dec 05 23:45:12 crc kubenswrapper[4734]: I1205 23:45:12.258776 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jbcx6"] Dec 05 23:45:12 crc kubenswrapper[4734]: I1205 23:45:12.831101 4734 generic.go:334] "Generic (PLEG): container finished" podID="06fe0661-7bf7-4f08-aa80-14fb6b7ee841" containerID="0891207bc2830d2d42212b8eedba6e39861627ec1abe38f4497aeb10177aa323" exitCode=0 Dec 05 23:45:12 crc kubenswrapper[4734]: I1205 23:45:12.831437 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbcx6" event={"ID":"06fe0661-7bf7-4f08-aa80-14fb6b7ee841","Type":"ContainerDied","Data":"0891207bc2830d2d42212b8eedba6e39861627ec1abe38f4497aeb10177aa323"} Dec 05 23:45:12 crc kubenswrapper[4734]: I1205 23:45:12.831476 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbcx6" event={"ID":"06fe0661-7bf7-4f08-aa80-14fb6b7ee841","Type":"ContainerStarted","Data":"9171f0029beb8e166feca95bc593108d9a80233071bb8441cd6458b207adff3d"} Dec 05 23:45:14 crc kubenswrapper[4734]: I1205 23:45:14.853415 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbcx6" event={"ID":"06fe0661-7bf7-4f08-aa80-14fb6b7ee841","Type":"ContainerStarted","Data":"bec55f5f18a35d43fe937b0dc9b8353b04e1352885a8fbbceec3c335eda60fd6"} Dec 05 23:45:15 crc kubenswrapper[4734]: I1205 23:45:15.866452 4734 generic.go:334] "Generic (PLEG): container finished" podID="06fe0661-7bf7-4f08-aa80-14fb6b7ee841" containerID="bec55f5f18a35d43fe937b0dc9b8353b04e1352885a8fbbceec3c335eda60fd6" exitCode=0 Dec 05 23:45:15 crc kubenswrapper[4734]: I1205 23:45:15.866536 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbcx6" 
event={"ID":"06fe0661-7bf7-4f08-aa80-14fb6b7ee841","Type":"ContainerDied","Data":"bec55f5f18a35d43fe937b0dc9b8353b04e1352885a8fbbceec3c335eda60fd6"} Dec 05 23:45:16 crc kubenswrapper[4734]: I1205 23:45:16.879999 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbcx6" event={"ID":"06fe0661-7bf7-4f08-aa80-14fb6b7ee841","Type":"ContainerStarted","Data":"6de4cf91cda98d53addaf37739c09c478234eec14caccea15d2613e9bd71fd51"} Dec 05 23:45:16 crc kubenswrapper[4734]: I1205 23:45:16.904697 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jbcx6" podStartSLOduration=2.235260992 podStartE2EDuration="5.904674536s" podCreationTimestamp="2025-12-05 23:45:11 +0000 UTC" firstStartedPulling="2025-12-05 23:45:12.83451453 +0000 UTC m=+1533.517918806" lastFinishedPulling="2025-12-05 23:45:16.503928074 +0000 UTC m=+1537.187332350" observedRunningTime="2025-12-05 23:45:16.903385625 +0000 UTC m=+1537.586789901" watchObservedRunningTime="2025-12-05 23:45:16.904674536 +0000 UTC m=+1537.588078812" Dec 05 23:45:20 crc kubenswrapper[4734]: I1205 23:45:20.444570 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:45:20 crc kubenswrapper[4734]: I1205 23:45:20.445347 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:45:21 crc kubenswrapper[4734]: I1205 23:45:21.691228 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-jbcx6" Dec 05 23:45:21 crc kubenswrapper[4734]: I1205 23:45:21.691824 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jbcx6" Dec 05 23:45:21 crc kubenswrapper[4734]: I1205 23:45:21.738269 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jbcx6" Dec 05 23:45:21 crc kubenswrapper[4734]: I1205 23:45:21.985042 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jbcx6" Dec 05 23:45:22 crc kubenswrapper[4734]: I1205 23:45:22.042798 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jbcx6"] Dec 05 23:45:23 crc kubenswrapper[4734]: I1205 23:45:23.950174 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jbcx6" podUID="06fe0661-7bf7-4f08-aa80-14fb6b7ee841" containerName="registry-server" containerID="cri-o://6de4cf91cda98d53addaf37739c09c478234eec14caccea15d2613e9bd71fd51" gracePeriod=2 Dec 05 23:45:24 crc kubenswrapper[4734]: I1205 23:45:24.976901 4734 generic.go:334] "Generic (PLEG): container finished" podID="06fe0661-7bf7-4f08-aa80-14fb6b7ee841" containerID="6de4cf91cda98d53addaf37739c09c478234eec14caccea15d2613e9bd71fd51" exitCode=0 Dec 05 23:45:24 crc kubenswrapper[4734]: I1205 23:45:24.976969 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbcx6" event={"ID":"06fe0661-7bf7-4f08-aa80-14fb6b7ee841","Type":"ContainerDied","Data":"6de4cf91cda98d53addaf37739c09c478234eec14caccea15d2613e9bd71fd51"} Dec 05 23:45:24 crc kubenswrapper[4734]: I1205 23:45:24.977444 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbcx6" 
event={"ID":"06fe0661-7bf7-4f08-aa80-14fb6b7ee841","Type":"ContainerDied","Data":"9171f0029beb8e166feca95bc593108d9a80233071bb8441cd6458b207adff3d"} Dec 05 23:45:24 crc kubenswrapper[4734]: I1205 23:45:24.977477 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9171f0029beb8e166feca95bc593108d9a80233071bb8441cd6458b207adff3d" Dec 05 23:45:24 crc kubenswrapper[4734]: I1205 23:45:24.980382 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jbcx6" Dec 05 23:45:25 crc kubenswrapper[4734]: I1205 23:45:25.077255 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06fe0661-7bf7-4f08-aa80-14fb6b7ee841-catalog-content\") pod \"06fe0661-7bf7-4f08-aa80-14fb6b7ee841\" (UID: \"06fe0661-7bf7-4f08-aa80-14fb6b7ee841\") " Dec 05 23:45:25 crc kubenswrapper[4734]: I1205 23:45:25.077504 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqvg5\" (UniqueName: \"kubernetes.io/projected/06fe0661-7bf7-4f08-aa80-14fb6b7ee841-kube-api-access-zqvg5\") pod \"06fe0661-7bf7-4f08-aa80-14fb6b7ee841\" (UID: \"06fe0661-7bf7-4f08-aa80-14fb6b7ee841\") " Dec 05 23:45:25 crc kubenswrapper[4734]: I1205 23:45:25.077563 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06fe0661-7bf7-4f08-aa80-14fb6b7ee841-utilities\") pod \"06fe0661-7bf7-4f08-aa80-14fb6b7ee841\" (UID: \"06fe0661-7bf7-4f08-aa80-14fb6b7ee841\") " Dec 05 23:45:25 crc kubenswrapper[4734]: I1205 23:45:25.078624 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06fe0661-7bf7-4f08-aa80-14fb6b7ee841-utilities" (OuterVolumeSpecName: "utilities") pod "06fe0661-7bf7-4f08-aa80-14fb6b7ee841" (UID: "06fe0661-7bf7-4f08-aa80-14fb6b7ee841"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:45:25 crc kubenswrapper[4734]: I1205 23:45:25.084215 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06fe0661-7bf7-4f08-aa80-14fb6b7ee841-kube-api-access-zqvg5" (OuterVolumeSpecName: "kube-api-access-zqvg5") pod "06fe0661-7bf7-4f08-aa80-14fb6b7ee841" (UID: "06fe0661-7bf7-4f08-aa80-14fb6b7ee841"). InnerVolumeSpecName "kube-api-access-zqvg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:45:25 crc kubenswrapper[4734]: I1205 23:45:25.132456 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06fe0661-7bf7-4f08-aa80-14fb6b7ee841-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "06fe0661-7bf7-4f08-aa80-14fb6b7ee841" (UID: "06fe0661-7bf7-4f08-aa80-14fb6b7ee841"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:45:25 crc kubenswrapper[4734]: I1205 23:45:25.180003 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqvg5\" (UniqueName: \"kubernetes.io/projected/06fe0661-7bf7-4f08-aa80-14fb6b7ee841-kube-api-access-zqvg5\") on node \"crc\" DevicePath \"\"" Dec 05 23:45:25 crc kubenswrapper[4734]: I1205 23:45:25.180044 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06fe0661-7bf7-4f08-aa80-14fb6b7ee841-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:45:25 crc kubenswrapper[4734]: I1205 23:45:25.180056 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06fe0661-7bf7-4f08-aa80-14fb6b7ee841-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:45:25 crc kubenswrapper[4734]: I1205 23:45:25.988175 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jbcx6" Dec 05 23:45:26 crc kubenswrapper[4734]: I1205 23:45:26.049838 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jbcx6"] Dec 05 23:45:26 crc kubenswrapper[4734]: I1205 23:45:26.059987 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jbcx6"] Dec 05 23:45:27 crc kubenswrapper[4734]: I1205 23:45:27.627346 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06fe0661-7bf7-4f08-aa80-14fb6b7ee841" path="/var/lib/kubelet/pods/06fe0661-7bf7-4f08-aa80-14fb6b7ee841/volumes" Dec 05 23:45:50 crc kubenswrapper[4734]: I1205 23:45:50.446020 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:45:50 crc kubenswrapper[4734]: I1205 23:45:50.447208 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:46:20 crc kubenswrapper[4734]: I1205 23:46:20.444809 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:46:20 crc kubenswrapper[4734]: I1205 23:46:20.445913 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" 
podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:46:20 crc kubenswrapper[4734]: I1205 23:46:20.446012 4734 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" Dec 05 23:46:20 crc kubenswrapper[4734]: I1205 23:46:20.447385 4734 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bf2990588260a60447594f55883e9e43735892e3ca942ebe017df1d6b8641fec"} pod="openshift-machine-config-operator/machine-config-daemon-vn94d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 23:46:20 crc kubenswrapper[4734]: I1205 23:46:20.447477 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" containerID="cri-o://bf2990588260a60447594f55883e9e43735892e3ca942ebe017df1d6b8641fec" gracePeriod=600 Dec 05 23:46:20 crc kubenswrapper[4734]: E1205 23:46:20.575241 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:46:20 crc kubenswrapper[4734]: I1205 23:46:20.584082 4734 generic.go:334] "Generic (PLEG): container finished" podID="65758270-a7a7-46b5-af95-0588daf9fa86" containerID="bf2990588260a60447594f55883e9e43735892e3ca942ebe017df1d6b8641fec" exitCode=0 Dec 05 
23:46:20 crc kubenswrapper[4734]: I1205 23:46:20.584120 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerDied","Data":"bf2990588260a60447594f55883e9e43735892e3ca942ebe017df1d6b8641fec"} Dec 05 23:46:20 crc kubenswrapper[4734]: I1205 23:46:20.584177 4734 scope.go:117] "RemoveContainer" containerID="b56de5effd3c2004c857decd42f072613bd8b7411853b07107e3e799cc6c9cfb" Dec 05 23:46:20 crc kubenswrapper[4734]: I1205 23:46:20.585833 4734 scope.go:117] "RemoveContainer" containerID="bf2990588260a60447594f55883e9e43735892e3ca942ebe017df1d6b8641fec" Dec 05 23:46:20 crc kubenswrapper[4734]: E1205 23:46:20.586215 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:46:35 crc kubenswrapper[4734]: I1205 23:46:35.614411 4734 scope.go:117] "RemoveContainer" containerID="bf2990588260a60447594f55883e9e43735892e3ca942ebe017df1d6b8641fec" Dec 05 23:46:35 crc kubenswrapper[4734]: E1205 23:46:35.615506 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:46:45 crc kubenswrapper[4734]: I1205 23:46:45.039580 4734 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-5c25w"] Dec 05 23:46:45 crc kubenswrapper[4734]: E1205 23:46:45.040896 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06fe0661-7bf7-4f08-aa80-14fb6b7ee841" containerName="extract-content" Dec 05 23:46:45 crc kubenswrapper[4734]: I1205 23:46:45.040914 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="06fe0661-7bf7-4f08-aa80-14fb6b7ee841" containerName="extract-content" Dec 05 23:46:45 crc kubenswrapper[4734]: E1205 23:46:45.040967 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06fe0661-7bf7-4f08-aa80-14fb6b7ee841" containerName="extract-utilities" Dec 05 23:46:45 crc kubenswrapper[4734]: I1205 23:46:45.040975 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="06fe0661-7bf7-4f08-aa80-14fb6b7ee841" containerName="extract-utilities" Dec 05 23:46:45 crc kubenswrapper[4734]: E1205 23:46:45.041000 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06fe0661-7bf7-4f08-aa80-14fb6b7ee841" containerName="registry-server" Dec 05 23:46:45 crc kubenswrapper[4734]: I1205 23:46:45.041007 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="06fe0661-7bf7-4f08-aa80-14fb6b7ee841" containerName="registry-server" Dec 05 23:46:45 crc kubenswrapper[4734]: I1205 23:46:45.041242 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="06fe0661-7bf7-4f08-aa80-14fb6b7ee841" containerName="registry-server" Dec 05 23:46:45 crc kubenswrapper[4734]: I1205 23:46:45.042986 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5c25w" Dec 05 23:46:45 crc kubenswrapper[4734]: I1205 23:46:45.050246 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5c25w"] Dec 05 23:46:45 crc kubenswrapper[4734]: I1205 23:46:45.189539 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e690a56c-2672-44b8-bbc2-c1bba6fca35f-utilities\") pod \"certified-operators-5c25w\" (UID: \"e690a56c-2672-44b8-bbc2-c1bba6fca35f\") " pod="openshift-marketplace/certified-operators-5c25w" Dec 05 23:46:45 crc kubenswrapper[4734]: I1205 23:46:45.189819 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e690a56c-2672-44b8-bbc2-c1bba6fca35f-catalog-content\") pod \"certified-operators-5c25w\" (UID: \"e690a56c-2672-44b8-bbc2-c1bba6fca35f\") " pod="openshift-marketplace/certified-operators-5c25w" Dec 05 23:46:45 crc kubenswrapper[4734]: I1205 23:46:45.189937 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2swzx\" (UniqueName: \"kubernetes.io/projected/e690a56c-2672-44b8-bbc2-c1bba6fca35f-kube-api-access-2swzx\") pod \"certified-operators-5c25w\" (UID: \"e690a56c-2672-44b8-bbc2-c1bba6fca35f\") " pod="openshift-marketplace/certified-operators-5c25w" Dec 05 23:46:45 crc kubenswrapper[4734]: I1205 23:46:45.292719 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e690a56c-2672-44b8-bbc2-c1bba6fca35f-utilities\") pod \"certified-operators-5c25w\" (UID: \"e690a56c-2672-44b8-bbc2-c1bba6fca35f\") " pod="openshift-marketplace/certified-operators-5c25w" Dec 05 23:46:45 crc kubenswrapper[4734]: I1205 23:46:45.292851 4734 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e690a56c-2672-44b8-bbc2-c1bba6fca35f-catalog-content\") pod \"certified-operators-5c25w\" (UID: \"e690a56c-2672-44b8-bbc2-c1bba6fca35f\") " pod="openshift-marketplace/certified-operators-5c25w" Dec 05 23:46:45 crc kubenswrapper[4734]: I1205 23:46:45.292897 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2swzx\" (UniqueName: \"kubernetes.io/projected/e690a56c-2672-44b8-bbc2-c1bba6fca35f-kube-api-access-2swzx\") pod \"certified-operators-5c25w\" (UID: \"e690a56c-2672-44b8-bbc2-c1bba6fca35f\") " pod="openshift-marketplace/certified-operators-5c25w" Dec 05 23:46:45 crc kubenswrapper[4734]: I1205 23:46:45.293322 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e690a56c-2672-44b8-bbc2-c1bba6fca35f-utilities\") pod \"certified-operators-5c25w\" (UID: \"e690a56c-2672-44b8-bbc2-c1bba6fca35f\") " pod="openshift-marketplace/certified-operators-5c25w" Dec 05 23:46:45 crc kubenswrapper[4734]: I1205 23:46:45.293472 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e690a56c-2672-44b8-bbc2-c1bba6fca35f-catalog-content\") pod \"certified-operators-5c25w\" (UID: \"e690a56c-2672-44b8-bbc2-c1bba6fca35f\") " pod="openshift-marketplace/certified-operators-5c25w" Dec 05 23:46:45 crc kubenswrapper[4734]: I1205 23:46:45.318286 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2swzx\" (UniqueName: \"kubernetes.io/projected/e690a56c-2672-44b8-bbc2-c1bba6fca35f-kube-api-access-2swzx\") pod \"certified-operators-5c25w\" (UID: \"e690a56c-2672-44b8-bbc2-c1bba6fca35f\") " pod="openshift-marketplace/certified-operators-5c25w" Dec 05 23:46:45 crc kubenswrapper[4734]: I1205 23:46:45.398965 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5c25w" Dec 05 23:46:45 crc kubenswrapper[4734]: I1205 23:46:45.909608 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5c25w"] Dec 05 23:46:46 crc kubenswrapper[4734]: I1205 23:46:46.865432 4734 generic.go:334] "Generic (PLEG): container finished" podID="e690a56c-2672-44b8-bbc2-c1bba6fca35f" containerID="4455cb8ed7a5facca4cc5dcd166387b2bb2f0e094aafd4e84d295a50e916be85" exitCode=0 Dec 05 23:46:46 crc kubenswrapper[4734]: I1205 23:46:46.865520 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5c25w" event={"ID":"e690a56c-2672-44b8-bbc2-c1bba6fca35f","Type":"ContainerDied","Data":"4455cb8ed7a5facca4cc5dcd166387b2bb2f0e094aafd4e84d295a50e916be85"} Dec 05 23:46:46 crc kubenswrapper[4734]: I1205 23:46:46.865922 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5c25w" event={"ID":"e690a56c-2672-44b8-bbc2-c1bba6fca35f","Type":"ContainerStarted","Data":"2e4b837e5284310ff7807d453b09261ac992f442527c746a1cfac2b8abb33d21"} Dec 05 23:46:46 crc kubenswrapper[4734]: I1205 23:46:46.867999 4734 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 23:46:47 crc kubenswrapper[4734]: I1205 23:46:47.879283 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5c25w" event={"ID":"e690a56c-2672-44b8-bbc2-c1bba6fca35f","Type":"ContainerStarted","Data":"3b23f1635d9ad083c1c8d8998f7fff698cfde5501e078999cc7b4b29452b6fe7"} Dec 05 23:46:48 crc kubenswrapper[4734]: I1205 23:46:48.893144 4734 generic.go:334] "Generic (PLEG): container finished" podID="e690a56c-2672-44b8-bbc2-c1bba6fca35f" containerID="3b23f1635d9ad083c1c8d8998f7fff698cfde5501e078999cc7b4b29452b6fe7" exitCode=0 Dec 05 23:46:48 crc kubenswrapper[4734]: I1205 23:46:48.893211 4734 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-5c25w" event={"ID":"e690a56c-2672-44b8-bbc2-c1bba6fca35f","Type":"ContainerDied","Data":"3b23f1635d9ad083c1c8d8998f7fff698cfde5501e078999cc7b4b29452b6fe7"} Dec 05 23:46:49 crc kubenswrapper[4734]: I1205 23:46:49.906880 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5c25w" event={"ID":"e690a56c-2672-44b8-bbc2-c1bba6fca35f","Type":"ContainerStarted","Data":"4b4ede0d17b899a3a27d1807d6ab4adfbde81e0e67f6214de16bed989d984cbe"} Dec 05 23:46:49 crc kubenswrapper[4734]: I1205 23:46:49.936448 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5c25w" podStartSLOduration=2.447413281 podStartE2EDuration="4.93641993s" podCreationTimestamp="2025-12-05 23:46:45 +0000 UTC" firstStartedPulling="2025-12-05 23:46:46.867642087 +0000 UTC m=+1627.551046373" lastFinishedPulling="2025-12-05 23:46:49.356648746 +0000 UTC m=+1630.040053022" observedRunningTime="2025-12-05 23:46:49.93146868 +0000 UTC m=+1630.614872956" watchObservedRunningTime="2025-12-05 23:46:49.93641993 +0000 UTC m=+1630.619824206" Dec 05 23:46:50 crc kubenswrapper[4734]: I1205 23:46:50.615428 4734 scope.go:117] "RemoveContainer" containerID="bf2990588260a60447594f55883e9e43735892e3ca942ebe017df1d6b8641fec" Dec 05 23:46:50 crc kubenswrapper[4734]: E1205 23:46:50.616255 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:46:55 crc kubenswrapper[4734]: I1205 23:46:55.399668 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-5c25w" Dec 05 23:46:55 crc kubenswrapper[4734]: I1205 23:46:55.400458 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5c25w" Dec 05 23:46:55 crc kubenswrapper[4734]: I1205 23:46:55.457676 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5c25w" Dec 05 23:46:56 crc kubenswrapper[4734]: I1205 23:46:56.028643 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5c25w" Dec 05 23:46:56 crc kubenswrapper[4734]: I1205 23:46:56.089875 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5c25w"] Dec 05 23:46:57 crc kubenswrapper[4734]: I1205 23:46:57.989115 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5c25w" podUID="e690a56c-2672-44b8-bbc2-c1bba6fca35f" containerName="registry-server" containerID="cri-o://4b4ede0d17b899a3a27d1807d6ab4adfbde81e0e67f6214de16bed989d984cbe" gracePeriod=2 Dec 05 23:46:58 crc kubenswrapper[4734]: I1205 23:46:58.562243 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5c25w" Dec 05 23:46:58 crc kubenswrapper[4734]: I1205 23:46:58.699263 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2swzx\" (UniqueName: \"kubernetes.io/projected/e690a56c-2672-44b8-bbc2-c1bba6fca35f-kube-api-access-2swzx\") pod \"e690a56c-2672-44b8-bbc2-c1bba6fca35f\" (UID: \"e690a56c-2672-44b8-bbc2-c1bba6fca35f\") " Dec 05 23:46:58 crc kubenswrapper[4734]: I1205 23:46:58.699661 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e690a56c-2672-44b8-bbc2-c1bba6fca35f-catalog-content\") pod \"e690a56c-2672-44b8-bbc2-c1bba6fca35f\" (UID: \"e690a56c-2672-44b8-bbc2-c1bba6fca35f\") " Dec 05 23:46:58 crc kubenswrapper[4734]: I1205 23:46:58.699769 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e690a56c-2672-44b8-bbc2-c1bba6fca35f-utilities\") pod \"e690a56c-2672-44b8-bbc2-c1bba6fca35f\" (UID: \"e690a56c-2672-44b8-bbc2-c1bba6fca35f\") " Dec 05 23:46:58 crc kubenswrapper[4734]: I1205 23:46:58.703248 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e690a56c-2672-44b8-bbc2-c1bba6fca35f-utilities" (OuterVolumeSpecName: "utilities") pod "e690a56c-2672-44b8-bbc2-c1bba6fca35f" (UID: "e690a56c-2672-44b8-bbc2-c1bba6fca35f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:46:58 crc kubenswrapper[4734]: I1205 23:46:58.708899 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e690a56c-2672-44b8-bbc2-c1bba6fca35f-kube-api-access-2swzx" (OuterVolumeSpecName: "kube-api-access-2swzx") pod "e690a56c-2672-44b8-bbc2-c1bba6fca35f" (UID: "e690a56c-2672-44b8-bbc2-c1bba6fca35f"). InnerVolumeSpecName "kube-api-access-2swzx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:46:58 crc kubenswrapper[4734]: I1205 23:46:58.766875 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e690a56c-2672-44b8-bbc2-c1bba6fca35f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e690a56c-2672-44b8-bbc2-c1bba6fca35f" (UID: "e690a56c-2672-44b8-bbc2-c1bba6fca35f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:46:58 crc kubenswrapper[4734]: I1205 23:46:58.803508 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e690a56c-2672-44b8-bbc2-c1bba6fca35f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:46:58 crc kubenswrapper[4734]: I1205 23:46:58.803601 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e690a56c-2672-44b8-bbc2-c1bba6fca35f-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:46:58 crc kubenswrapper[4734]: I1205 23:46:58.803617 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2swzx\" (UniqueName: \"kubernetes.io/projected/e690a56c-2672-44b8-bbc2-c1bba6fca35f-kube-api-access-2swzx\") on node \"crc\" DevicePath \"\"" Dec 05 23:46:59 crc kubenswrapper[4734]: I1205 23:46:59.003434 4734 generic.go:334] "Generic (PLEG): container finished" podID="e690a56c-2672-44b8-bbc2-c1bba6fca35f" containerID="4b4ede0d17b899a3a27d1807d6ab4adfbde81e0e67f6214de16bed989d984cbe" exitCode=0 Dec 05 23:46:59 crc kubenswrapper[4734]: I1205 23:46:59.003506 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5c25w" event={"ID":"e690a56c-2672-44b8-bbc2-c1bba6fca35f","Type":"ContainerDied","Data":"4b4ede0d17b899a3a27d1807d6ab4adfbde81e0e67f6214de16bed989d984cbe"} Dec 05 23:46:59 crc kubenswrapper[4734]: I1205 23:46:59.003589 4734 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5c25w" Dec 05 23:46:59 crc kubenswrapper[4734]: I1205 23:46:59.003999 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5c25w" event={"ID":"e690a56c-2672-44b8-bbc2-c1bba6fca35f","Type":"ContainerDied","Data":"2e4b837e5284310ff7807d453b09261ac992f442527c746a1cfac2b8abb33d21"} Dec 05 23:46:59 crc kubenswrapper[4734]: I1205 23:46:59.004040 4734 scope.go:117] "RemoveContainer" containerID="4b4ede0d17b899a3a27d1807d6ab4adfbde81e0e67f6214de16bed989d984cbe" Dec 05 23:46:59 crc kubenswrapper[4734]: I1205 23:46:59.044791 4734 scope.go:117] "RemoveContainer" containerID="3b23f1635d9ad083c1c8d8998f7fff698cfde5501e078999cc7b4b29452b6fe7" Dec 05 23:46:59 crc kubenswrapper[4734]: I1205 23:46:59.050693 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5c25w"] Dec 05 23:46:59 crc kubenswrapper[4734]: I1205 23:46:59.064268 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5c25w"] Dec 05 23:46:59 crc kubenswrapper[4734]: I1205 23:46:59.083351 4734 scope.go:117] "RemoveContainer" containerID="4455cb8ed7a5facca4cc5dcd166387b2bb2f0e094aafd4e84d295a50e916be85" Dec 05 23:46:59 crc kubenswrapper[4734]: I1205 23:46:59.145275 4734 scope.go:117] "RemoveContainer" containerID="4b4ede0d17b899a3a27d1807d6ab4adfbde81e0e67f6214de16bed989d984cbe" Dec 05 23:46:59 crc kubenswrapper[4734]: E1205 23:46:59.146161 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b4ede0d17b899a3a27d1807d6ab4adfbde81e0e67f6214de16bed989d984cbe\": container with ID starting with 4b4ede0d17b899a3a27d1807d6ab4adfbde81e0e67f6214de16bed989d984cbe not found: ID does not exist" containerID="4b4ede0d17b899a3a27d1807d6ab4adfbde81e0e67f6214de16bed989d984cbe" Dec 05 23:46:59 crc kubenswrapper[4734]: I1205 23:46:59.146223 
4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b4ede0d17b899a3a27d1807d6ab4adfbde81e0e67f6214de16bed989d984cbe"} err="failed to get container status \"4b4ede0d17b899a3a27d1807d6ab4adfbde81e0e67f6214de16bed989d984cbe\": rpc error: code = NotFound desc = could not find container \"4b4ede0d17b899a3a27d1807d6ab4adfbde81e0e67f6214de16bed989d984cbe\": container with ID starting with 4b4ede0d17b899a3a27d1807d6ab4adfbde81e0e67f6214de16bed989d984cbe not found: ID does not exist" Dec 05 23:46:59 crc kubenswrapper[4734]: I1205 23:46:59.146268 4734 scope.go:117] "RemoveContainer" containerID="3b23f1635d9ad083c1c8d8998f7fff698cfde5501e078999cc7b4b29452b6fe7" Dec 05 23:46:59 crc kubenswrapper[4734]: E1205 23:46:59.147049 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b23f1635d9ad083c1c8d8998f7fff698cfde5501e078999cc7b4b29452b6fe7\": container with ID starting with 3b23f1635d9ad083c1c8d8998f7fff698cfde5501e078999cc7b4b29452b6fe7 not found: ID does not exist" containerID="3b23f1635d9ad083c1c8d8998f7fff698cfde5501e078999cc7b4b29452b6fe7" Dec 05 23:46:59 crc kubenswrapper[4734]: I1205 23:46:59.147140 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b23f1635d9ad083c1c8d8998f7fff698cfde5501e078999cc7b4b29452b6fe7"} err="failed to get container status \"3b23f1635d9ad083c1c8d8998f7fff698cfde5501e078999cc7b4b29452b6fe7\": rpc error: code = NotFound desc = could not find container \"3b23f1635d9ad083c1c8d8998f7fff698cfde5501e078999cc7b4b29452b6fe7\": container with ID starting with 3b23f1635d9ad083c1c8d8998f7fff698cfde5501e078999cc7b4b29452b6fe7 not found: ID does not exist" Dec 05 23:46:59 crc kubenswrapper[4734]: I1205 23:46:59.147203 4734 scope.go:117] "RemoveContainer" containerID="4455cb8ed7a5facca4cc5dcd166387b2bb2f0e094aafd4e84d295a50e916be85" Dec 05 23:46:59 crc kubenswrapper[4734]: E1205 
23:46:59.147737 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4455cb8ed7a5facca4cc5dcd166387b2bb2f0e094aafd4e84d295a50e916be85\": container with ID starting with 4455cb8ed7a5facca4cc5dcd166387b2bb2f0e094aafd4e84d295a50e916be85 not found: ID does not exist" containerID="4455cb8ed7a5facca4cc5dcd166387b2bb2f0e094aafd4e84d295a50e916be85" Dec 05 23:46:59 crc kubenswrapper[4734]: I1205 23:46:59.147792 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4455cb8ed7a5facca4cc5dcd166387b2bb2f0e094aafd4e84d295a50e916be85"} err="failed to get container status \"4455cb8ed7a5facca4cc5dcd166387b2bb2f0e094aafd4e84d295a50e916be85\": rpc error: code = NotFound desc = could not find container \"4455cb8ed7a5facca4cc5dcd166387b2bb2f0e094aafd4e84d295a50e916be85\": container with ID starting with 4455cb8ed7a5facca4cc5dcd166387b2bb2f0e094aafd4e84d295a50e916be85 not found: ID does not exist" Dec 05 23:46:59 crc kubenswrapper[4734]: I1205 23:46:59.625602 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e690a56c-2672-44b8-bbc2-c1bba6fca35f" path="/var/lib/kubelet/pods/e690a56c-2672-44b8-bbc2-c1bba6fca35f/volumes" Dec 05 23:47:02 crc kubenswrapper[4734]: I1205 23:47:02.614021 4734 scope.go:117] "RemoveContainer" containerID="bf2990588260a60447594f55883e9e43735892e3ca942ebe017df1d6b8641fec" Dec 05 23:47:02 crc kubenswrapper[4734]: E1205 23:47:02.615109 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:47:05 crc kubenswrapper[4734]: I1205 23:47:05.076929 
4734 generic.go:334] "Generic (PLEG): container finished" podID="faef139d-614e-4c50-a383-8dd231a47b83" containerID="0e6d681385369a0b51703c270598fb237e8d0bd9d8058d34e8a4cc738ceb7c65" exitCode=0 Dec 05 23:47:05 crc kubenswrapper[4734]: I1205 23:47:05.077042 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw" event={"ID":"faef139d-614e-4c50-a383-8dd231a47b83","Type":"ContainerDied","Data":"0e6d681385369a0b51703c270598fb237e8d0bd9d8058d34e8a4cc738ceb7c65"} Dec 05 23:47:06 crc kubenswrapper[4734]: I1205 23:47:06.592697 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw" Dec 05 23:47:06 crc kubenswrapper[4734]: I1205 23:47:06.693682 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/faef139d-614e-4c50-a383-8dd231a47b83-ssh-key\") pod \"faef139d-614e-4c50-a383-8dd231a47b83\" (UID: \"faef139d-614e-4c50-a383-8dd231a47b83\") " Dec 05 23:47:06 crc kubenswrapper[4734]: I1205 23:47:06.693869 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faef139d-614e-4c50-a383-8dd231a47b83-bootstrap-combined-ca-bundle\") pod \"faef139d-614e-4c50-a383-8dd231a47b83\" (UID: \"faef139d-614e-4c50-a383-8dd231a47b83\") " Dec 05 23:47:06 crc kubenswrapper[4734]: I1205 23:47:06.693965 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p99vr\" (UniqueName: \"kubernetes.io/projected/faef139d-614e-4c50-a383-8dd231a47b83-kube-api-access-p99vr\") pod \"faef139d-614e-4c50-a383-8dd231a47b83\" (UID: \"faef139d-614e-4c50-a383-8dd231a47b83\") " Dec 05 23:47:06 crc kubenswrapper[4734]: I1205 23:47:06.694014 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/faef139d-614e-4c50-a383-8dd231a47b83-inventory\") pod \"faef139d-614e-4c50-a383-8dd231a47b83\" (UID: \"faef139d-614e-4c50-a383-8dd231a47b83\") " Dec 05 23:47:06 crc kubenswrapper[4734]: I1205 23:47:06.702267 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faef139d-614e-4c50-a383-8dd231a47b83-kube-api-access-p99vr" (OuterVolumeSpecName: "kube-api-access-p99vr") pod "faef139d-614e-4c50-a383-8dd231a47b83" (UID: "faef139d-614e-4c50-a383-8dd231a47b83"). InnerVolumeSpecName "kube-api-access-p99vr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:47:06 crc kubenswrapper[4734]: I1205 23:47:06.705926 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faef139d-614e-4c50-a383-8dd231a47b83-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "faef139d-614e-4c50-a383-8dd231a47b83" (UID: "faef139d-614e-4c50-a383-8dd231a47b83"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:47:06 crc kubenswrapper[4734]: I1205 23:47:06.733653 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faef139d-614e-4c50-a383-8dd231a47b83-inventory" (OuterVolumeSpecName: "inventory") pod "faef139d-614e-4c50-a383-8dd231a47b83" (UID: "faef139d-614e-4c50-a383-8dd231a47b83"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:47:06 crc kubenswrapper[4734]: I1205 23:47:06.733863 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faef139d-614e-4c50-a383-8dd231a47b83-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "faef139d-614e-4c50-a383-8dd231a47b83" (UID: "faef139d-614e-4c50-a383-8dd231a47b83"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:47:06 crc kubenswrapper[4734]: I1205 23:47:06.804942 4734 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faef139d-614e-4c50-a383-8dd231a47b83-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:47:06 crc kubenswrapper[4734]: I1205 23:47:06.805002 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p99vr\" (UniqueName: \"kubernetes.io/projected/faef139d-614e-4c50-a383-8dd231a47b83-kube-api-access-p99vr\") on node \"crc\" DevicePath \"\"" Dec 05 23:47:06 crc kubenswrapper[4734]: I1205 23:47:06.805036 4734 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/faef139d-614e-4c50-a383-8dd231a47b83-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 23:47:06 crc kubenswrapper[4734]: I1205 23:47:06.805052 4734 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/faef139d-614e-4c50-a383-8dd231a47b83-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 23:47:07 crc kubenswrapper[4734]: I1205 23:47:07.099846 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw" event={"ID":"faef139d-614e-4c50-a383-8dd231a47b83","Type":"ContainerDied","Data":"9b54b978335fc419391228ed882d28af6ba7c84daa620c6a83a7efaf76ebcfe6"} Dec 05 23:47:07 crc kubenswrapper[4734]: I1205 23:47:07.100740 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b54b978335fc419391228ed882d28af6ba7c84daa620c6a83a7efaf76ebcfe6" Dec 05 23:47:07 crc kubenswrapper[4734]: I1205 23:47:07.101102 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw" Dec 05 23:47:07 crc kubenswrapper[4734]: I1205 23:47:07.209589 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8d9k"] Dec 05 23:47:07 crc kubenswrapper[4734]: E1205 23:47:07.210207 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e690a56c-2672-44b8-bbc2-c1bba6fca35f" containerName="registry-server" Dec 05 23:47:07 crc kubenswrapper[4734]: I1205 23:47:07.210238 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="e690a56c-2672-44b8-bbc2-c1bba6fca35f" containerName="registry-server" Dec 05 23:47:07 crc kubenswrapper[4734]: E1205 23:47:07.210287 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e690a56c-2672-44b8-bbc2-c1bba6fca35f" containerName="extract-content" Dec 05 23:47:07 crc kubenswrapper[4734]: I1205 23:47:07.210298 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="e690a56c-2672-44b8-bbc2-c1bba6fca35f" containerName="extract-content" Dec 05 23:47:07 crc kubenswrapper[4734]: E1205 23:47:07.210312 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faef139d-614e-4c50-a383-8dd231a47b83" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 05 23:47:07 crc kubenswrapper[4734]: I1205 23:47:07.210322 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="faef139d-614e-4c50-a383-8dd231a47b83" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 05 23:47:07 crc kubenswrapper[4734]: E1205 23:47:07.210341 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e690a56c-2672-44b8-bbc2-c1bba6fca35f" containerName="extract-utilities" Dec 05 23:47:07 crc kubenswrapper[4734]: I1205 23:47:07.210347 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="e690a56c-2672-44b8-bbc2-c1bba6fca35f" containerName="extract-utilities" Dec 05 23:47:07 crc kubenswrapper[4734]: I1205 23:47:07.210600 
4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="e690a56c-2672-44b8-bbc2-c1bba6fca35f" containerName="registry-server" Dec 05 23:47:07 crc kubenswrapper[4734]: I1205 23:47:07.210627 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="faef139d-614e-4c50-a383-8dd231a47b83" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 05 23:47:07 crc kubenswrapper[4734]: I1205 23:47:07.211594 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8d9k" Dec 05 23:47:07 crc kubenswrapper[4734]: I1205 23:47:07.215199 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 23:47:07 crc kubenswrapper[4734]: I1205 23:47:07.215330 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gsdqx" Dec 05 23:47:07 crc kubenswrapper[4734]: I1205 23:47:07.215357 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 23:47:07 crc kubenswrapper[4734]: I1205 23:47:07.215740 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 23:47:07 crc kubenswrapper[4734]: I1205 23:47:07.229017 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8d9k"] Dec 05 23:47:07 crc kubenswrapper[4734]: I1205 23:47:07.316009 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm5c9\" (UniqueName: \"kubernetes.io/projected/b881d911-43a8-4290-98e8-89e268e162e4-kube-api-access-cm5c9\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b8d9k\" (UID: \"b881d911-43a8-4290-98e8-89e268e162e4\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8d9k" Dec 05 23:47:07 crc 
kubenswrapper[4734]: I1205 23:47:07.316448 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b881d911-43a8-4290-98e8-89e268e162e4-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b8d9k\" (UID: \"b881d911-43a8-4290-98e8-89e268e162e4\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8d9k" Dec 05 23:47:07 crc kubenswrapper[4734]: I1205 23:47:07.316767 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b881d911-43a8-4290-98e8-89e268e162e4-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b8d9k\" (UID: \"b881d911-43a8-4290-98e8-89e268e162e4\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8d9k" Dec 05 23:47:07 crc kubenswrapper[4734]: I1205 23:47:07.419597 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm5c9\" (UniqueName: \"kubernetes.io/projected/b881d911-43a8-4290-98e8-89e268e162e4-kube-api-access-cm5c9\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b8d9k\" (UID: \"b881d911-43a8-4290-98e8-89e268e162e4\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8d9k" Dec 05 23:47:07 crc kubenswrapper[4734]: I1205 23:47:07.419795 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b881d911-43a8-4290-98e8-89e268e162e4-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b8d9k\" (UID: \"b881d911-43a8-4290-98e8-89e268e162e4\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8d9k" Dec 05 23:47:07 crc kubenswrapper[4734]: I1205 23:47:07.419944 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/b881d911-43a8-4290-98e8-89e268e162e4-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b8d9k\" (UID: \"b881d911-43a8-4290-98e8-89e268e162e4\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8d9k" Dec 05 23:47:07 crc kubenswrapper[4734]: I1205 23:47:07.427192 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b881d911-43a8-4290-98e8-89e268e162e4-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b8d9k\" (UID: \"b881d911-43a8-4290-98e8-89e268e162e4\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8d9k" Dec 05 23:47:07 crc kubenswrapper[4734]: I1205 23:47:07.427401 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b881d911-43a8-4290-98e8-89e268e162e4-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b8d9k\" (UID: \"b881d911-43a8-4290-98e8-89e268e162e4\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8d9k" Dec 05 23:47:07 crc kubenswrapper[4734]: I1205 23:47:07.439765 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm5c9\" (UniqueName: \"kubernetes.io/projected/b881d911-43a8-4290-98e8-89e268e162e4-kube-api-access-cm5c9\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b8d9k\" (UID: \"b881d911-43a8-4290-98e8-89e268e162e4\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8d9k" Dec 05 23:47:07 crc kubenswrapper[4734]: I1205 23:47:07.535096 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8d9k" Dec 05 23:47:07 crc kubenswrapper[4734]: I1205 23:47:07.932673 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8d9k"] Dec 05 23:47:08 crc kubenswrapper[4734]: I1205 23:47:08.112708 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8d9k" event={"ID":"b881d911-43a8-4290-98e8-89e268e162e4","Type":"ContainerStarted","Data":"b4fe3b25146e00c411bf5b7b0d9b26808d2d91501dd72d93b345d4f1c429d6d5"} Dec 05 23:47:09 crc kubenswrapper[4734]: I1205 23:47:09.053151 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-8463-account-create-update-psrz9"] Dec 05 23:47:09 crc kubenswrapper[4734]: I1205 23:47:09.066271 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-n8dcb"] Dec 05 23:47:09 crc kubenswrapper[4734]: I1205 23:47:09.075376 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-xm6wd"] Dec 05 23:47:09 crc kubenswrapper[4734]: I1205 23:47:09.084781 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-d0b4-account-create-update-qssx2"] Dec 05 23:47:09 crc kubenswrapper[4734]: I1205 23:47:09.094563 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-8463-account-create-update-psrz9"] Dec 05 23:47:09 crc kubenswrapper[4734]: I1205 23:47:09.102391 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-xm6wd"] Dec 05 23:47:09 crc kubenswrapper[4734]: I1205 23:47:09.111201 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-n8dcb"] Dec 05 23:47:09 crc kubenswrapper[4734]: I1205 23:47:09.120971 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-d0b4-account-create-update-qssx2"] Dec 05 
23:47:09 crc kubenswrapper[4734]: I1205 23:47:09.126331 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8d9k" event={"ID":"b881d911-43a8-4290-98e8-89e268e162e4","Type":"ContainerStarted","Data":"00d0831dec8e27e06befc50b1d58d63338a957f9891032fef5dfec7f7ad774b2"} Dec 05 23:47:09 crc kubenswrapper[4734]: I1205 23:47:09.153763 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8d9k" podStartSLOduration=1.294772719 podStartE2EDuration="2.153732055s" podCreationTimestamp="2025-12-05 23:47:07 +0000 UTC" firstStartedPulling="2025-12-05 23:47:07.946828095 +0000 UTC m=+1648.630232381" lastFinishedPulling="2025-12-05 23:47:08.805787431 +0000 UTC m=+1649.489191717" observedRunningTime="2025-12-05 23:47:09.143120767 +0000 UTC m=+1649.826525053" watchObservedRunningTime="2025-12-05 23:47:09.153732055 +0000 UTC m=+1649.837136331" Dec 05 23:47:09 crc kubenswrapper[4734]: I1205 23:47:09.631817 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d76d646-48a4-405f-ba5e-fa7ef1775294" path="/var/lib/kubelet/pods/3d76d646-48a4-405f-ba5e-fa7ef1775294/volumes" Dec 05 23:47:09 crc kubenswrapper[4734]: I1205 23:47:09.632809 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40a48ff8-8c86-4d37-8962-bb74618c2558" path="/var/lib/kubelet/pods/40a48ff8-8c86-4d37-8962-bb74618c2558/volumes" Dec 05 23:47:09 crc kubenswrapper[4734]: I1205 23:47:09.633765 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c28ce93a-e28b-4be0-87bd-38e5dc1383df" path="/var/lib/kubelet/pods/c28ce93a-e28b-4be0-87bd-38e5dc1383df/volumes" Dec 05 23:47:09 crc kubenswrapper[4734]: I1205 23:47:09.634955 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0266b9f-86d4-462a-a20f-8897e48bfa43" path="/var/lib/kubelet/pods/f0266b9f-86d4-462a-a20f-8897e48bfa43/volumes" Dec 05 
23:47:15 crc kubenswrapper[4734]: I1205 23:47:15.034704 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-s269z"] Dec 05 23:47:15 crc kubenswrapper[4734]: I1205 23:47:15.044985 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-s269z"] Dec 05 23:47:15 crc kubenswrapper[4734]: I1205 23:47:15.055339 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-351c-account-create-update-npg5f"] Dec 05 23:47:15 crc kubenswrapper[4734]: I1205 23:47:15.065358 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-351c-account-create-update-npg5f"] Dec 05 23:47:15 crc kubenswrapper[4734]: I1205 23:47:15.626835 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="027d4639-edeb-422d-b5c5-f8ecfcd704dd" path="/var/lib/kubelet/pods/027d4639-edeb-422d-b5c5-f8ecfcd704dd/volumes" Dec 05 23:47:15 crc kubenswrapper[4734]: I1205 23:47:15.627595 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9614c3a8-524e-4641-9abb-a991a9c884ae" path="/var/lib/kubelet/pods/9614c3a8-524e-4641-9abb-a991a9c884ae/volumes" Dec 05 23:47:17 crc kubenswrapper[4734]: I1205 23:47:17.614668 4734 scope.go:117] "RemoveContainer" containerID="bf2990588260a60447594f55883e9e43735892e3ca942ebe017df1d6b8641fec" Dec 05 23:47:17 crc kubenswrapper[4734]: E1205 23:47:17.615506 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:47:29 crc kubenswrapper[4734]: I1205 23:47:29.625386 4734 scope.go:117] "RemoveContainer" 
containerID="bf2990588260a60447594f55883e9e43735892e3ca942ebe017df1d6b8641fec" Dec 05 23:47:29 crc kubenswrapper[4734]: E1205 23:47:29.626696 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:47:41 crc kubenswrapper[4734]: I1205 23:47:41.614597 4734 scope.go:117] "RemoveContainer" containerID="bf2990588260a60447594f55883e9e43735892e3ca942ebe017df1d6b8641fec" Dec 05 23:47:41 crc kubenswrapper[4734]: E1205 23:47:41.615624 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:47:50 crc kubenswrapper[4734]: I1205 23:47:50.051471 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-e6fc-account-create-update-whgg9"] Dec 05 23:47:50 crc kubenswrapper[4734]: I1205 23:47:50.057862 4734 scope.go:117] "RemoveContainer" containerID="64cdede5a62c626b94ec4a2cfded9bdb990ab81051608b09b39824cea9f2859c" Dec 05 23:47:50 crc kubenswrapper[4734]: I1205 23:47:50.071710 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-s5w2r"] Dec 05 23:47:50 crc kubenswrapper[4734]: I1205 23:47:50.091970 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-a7d4-account-create-update-6mgtp"] Dec 05 23:47:50 crc kubenswrapper[4734]: I1205 23:47:50.099095 
4734 scope.go:117] "RemoveContainer" containerID="cdac87b363b6aea38969a00571152c140ad2cb5e61e04cec9c43b444ed891cbf" Dec 05 23:47:50 crc kubenswrapper[4734]: I1205 23:47:50.104152 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-bjk8x"] Dec 05 23:47:50 crc kubenswrapper[4734]: I1205 23:47:50.114079 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f18c-account-create-update-qp7p8"] Dec 05 23:47:50 crc kubenswrapper[4734]: I1205 23:47:50.125989 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-6zn8b"] Dec 05 23:47:50 crc kubenswrapper[4734]: I1205 23:47:50.141132 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-s5w2r"] Dec 05 23:47:50 crc kubenswrapper[4734]: I1205 23:47:50.150796 4734 scope.go:117] "RemoveContainer" containerID="7d6874eb4a8f97ea3a945ced508aaf272a8cce2f377d304d99de29b142e4fec6" Dec 05 23:47:50 crc kubenswrapper[4734]: I1205 23:47:50.154995 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-bjk8x"] Dec 05 23:47:50 crc kubenswrapper[4734]: I1205 23:47:50.197076 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-a7d4-account-create-update-6mgtp"] Dec 05 23:47:50 crc kubenswrapper[4734]: I1205 23:47:50.211922 4734 scope.go:117] "RemoveContainer" containerID="9b4adfba9cb2e0ec28fc8f79440f61f1e28013f2db74580cc72374cbd7221c7e" Dec 05 23:47:50 crc kubenswrapper[4734]: I1205 23:47:50.213551 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-e6fc-account-create-update-whgg9"] Dec 05 23:47:50 crc kubenswrapper[4734]: I1205 23:47:50.224689 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f18c-account-create-update-qp7p8"] Dec 05 23:47:50 crc kubenswrapper[4734]: I1205 23:47:50.238208 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-6zn8b"] Dec 05 23:47:50 crc 
kubenswrapper[4734]: I1205 23:47:50.274516 4734 scope.go:117] "RemoveContainer" containerID="a74d873d0501707e58d8df6b4cc5aa70659cc956a4c62572ff2ec301f126c5ec" Dec 05 23:47:50 crc kubenswrapper[4734]: I1205 23:47:50.306490 4734 scope.go:117] "RemoveContainer" containerID="f9185ca68f816af79e3ee4414d88f12b53176d69b37e70ffedf32cdf1cb19599" Dec 05 23:47:51 crc kubenswrapper[4734]: I1205 23:47:51.629385 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f3e2e20-cc04-41cd-94df-e0748036144a" path="/var/lib/kubelet/pods/1f3e2e20-cc04-41cd-94df-e0748036144a/volumes" Dec 05 23:47:51 crc kubenswrapper[4734]: I1205 23:47:51.630905 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26927571-4070-4160-9e19-82f06d7d2a06" path="/var/lib/kubelet/pods/26927571-4070-4160-9e19-82f06d7d2a06/volumes" Dec 05 23:47:51 crc kubenswrapper[4734]: I1205 23:47:51.631907 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f5039b6-0875-4dd9-a5c4-9e6849dbb221" path="/var/lib/kubelet/pods/6f5039b6-0875-4dd9-a5c4-9e6849dbb221/volumes" Dec 05 23:47:51 crc kubenswrapper[4734]: I1205 23:47:51.632613 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e04b323-6b27-4e58-9688-a7bc57317e6e" path="/var/lib/kubelet/pods/8e04b323-6b27-4e58-9688-a7bc57317e6e/volumes" Dec 05 23:47:51 crc kubenswrapper[4734]: I1205 23:47:51.633934 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c2002ce-57e4-45bd-9110-9f7ebd50d0e7" path="/var/lib/kubelet/pods/9c2002ce-57e4-45bd-9110-9f7ebd50d0e7/volumes" Dec 05 23:47:51 crc kubenswrapper[4734]: I1205 23:47:51.634567 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe0296c8-8413-4c49-ae9c-e68e1dcbdb03" path="/var/lib/kubelet/pods/fe0296c8-8413-4c49-ae9c-e68e1dcbdb03/volumes" Dec 05 23:47:52 crc kubenswrapper[4734]: I1205 23:47:52.615091 4734 scope.go:117] "RemoveContainer" 
containerID="bf2990588260a60447594f55883e9e43735892e3ca942ebe017df1d6b8641fec" Dec 05 23:47:52 crc kubenswrapper[4734]: E1205 23:47:52.616100 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:48:01 crc kubenswrapper[4734]: I1205 23:48:01.051479 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-khd2f"] Dec 05 23:48:01 crc kubenswrapper[4734]: I1205 23:48:01.061871 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-khd2f"] Dec 05 23:48:01 crc kubenswrapper[4734]: I1205 23:48:01.627143 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="019f8649-b37e-4970-a742-33afa217a2b4" path="/var/lib/kubelet/pods/019f8649-b37e-4970-a742-33afa217a2b4/volumes" Dec 05 23:48:05 crc kubenswrapper[4734]: I1205 23:48:05.614148 4734 scope.go:117] "RemoveContainer" containerID="bf2990588260a60447594f55883e9e43735892e3ca942ebe017df1d6b8641fec" Dec 05 23:48:05 crc kubenswrapper[4734]: E1205 23:48:05.615020 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:48:17 crc kubenswrapper[4734]: I1205 23:48:17.614666 4734 scope.go:117] "RemoveContainer" containerID="bf2990588260a60447594f55883e9e43735892e3ca942ebe017df1d6b8641fec" Dec 05 
23:48:17 crc kubenswrapper[4734]: E1205 23:48:17.615629 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:48:23 crc kubenswrapper[4734]: I1205 23:48:23.078765 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-wzn74"] Dec 05 23:48:23 crc kubenswrapper[4734]: I1205 23:48:23.087491 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-wzn74"] Dec 05 23:48:23 crc kubenswrapper[4734]: I1205 23:48:23.632012 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="741e9328-bc42-4fae-b3dd-316f3286fa42" path="/var/lib/kubelet/pods/741e9328-bc42-4fae-b3dd-316f3286fa42/volumes" Dec 05 23:48:30 crc kubenswrapper[4734]: I1205 23:48:30.615157 4734 scope.go:117] "RemoveContainer" containerID="bf2990588260a60447594f55883e9e43735892e3ca942ebe017df1d6b8641fec" Dec 05 23:48:30 crc kubenswrapper[4734]: E1205 23:48:30.616289 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:48:38 crc kubenswrapper[4734]: I1205 23:48:38.056033 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-58fkh"] Dec 05 23:48:38 crc kubenswrapper[4734]: I1205 23:48:38.066124 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/neutron-db-sync-58fkh"] Dec 05 23:48:39 crc kubenswrapper[4734]: I1205 23:48:39.627191 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c427c6a-2e27-4e8d-9088-1cdad55da769" path="/var/lib/kubelet/pods/6c427c6a-2e27-4e8d-9088-1cdad55da769/volumes" Dec 05 23:48:41 crc kubenswrapper[4734]: I1205 23:48:41.614709 4734 scope.go:117] "RemoveContainer" containerID="bf2990588260a60447594f55883e9e43735892e3ca942ebe017df1d6b8641fec" Dec 05 23:48:41 crc kubenswrapper[4734]: E1205 23:48:41.616235 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:48:48 crc kubenswrapper[4734]: I1205 23:48:48.037762 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-zhc67"] Dec 05 23:48:48 crc kubenswrapper[4734]: I1205 23:48:48.052306 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-6x54r"] Dec 05 23:48:48 crc kubenswrapper[4734]: I1205 23:48:48.063836 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-zhc67"] Dec 05 23:48:48 crc kubenswrapper[4734]: I1205 23:48:48.072595 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-6x54r"] Dec 05 23:48:49 crc kubenswrapper[4734]: I1205 23:48:49.631128 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df84fed8-d899-47ed-a702-2fbae2f75d53" path="/var/lib/kubelet/pods/df84fed8-d899-47ed-a702-2fbae2f75d53/volumes" Dec 05 23:48:49 crc kubenswrapper[4734]: I1205 23:48:49.632027 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e97c0b2c-1294-43eb-a424-5c04e198611e" path="/var/lib/kubelet/pods/e97c0b2c-1294-43eb-a424-5c04e198611e/volumes" Dec 05 23:48:50 crc kubenswrapper[4734]: I1205 23:48:50.500374 4734 scope.go:117] "RemoveContainer" containerID="adea1b041e1d7c952fad653f11acbe0d9b04cc0b8533443a7ec38aa9b962d3bf" Dec 05 23:48:50 crc kubenswrapper[4734]: I1205 23:48:50.558746 4734 scope.go:117] "RemoveContainer" containerID="25ea3dd465d7374c67e296a8dbcfcb478d415f8e89230b9c95e81b2f9a1371ae" Dec 05 23:48:50 crc kubenswrapper[4734]: I1205 23:48:50.587214 4734 scope.go:117] "RemoveContainer" containerID="7f71bf11891772dbfb88764495aa12ed3fc0c9cfca8e941570c5f0658deb175b" Dec 05 23:48:50 crc kubenswrapper[4734]: I1205 23:48:50.646452 4734 scope.go:117] "RemoveContainer" containerID="e26cf08257b3b808b32026a3695581b91ee4cf7ccec4d0a1ebab2542ba48752b" Dec 05 23:48:50 crc kubenswrapper[4734]: I1205 23:48:50.682734 4734 scope.go:117] "RemoveContainer" containerID="ecaea013bad616b1d169bf69540826a36097fec2f10acbe2671b8ebb2f430d19" Dec 05 23:48:50 crc kubenswrapper[4734]: I1205 23:48:50.754550 4734 scope.go:117] "RemoveContainer" containerID="b97e73187b85ceeab5cfd09e4d53d4dbb45cabe0bfe6efb107a76fbc9b10d289" Dec 05 23:48:50 crc kubenswrapper[4734]: I1205 23:48:50.783737 4734 scope.go:117] "RemoveContainer" containerID="d573c95c457ec42a3ba6fba3952780a656918af85cc999b56e5043e47c97c283" Dec 05 23:48:50 crc kubenswrapper[4734]: I1205 23:48:50.837237 4734 scope.go:117] "RemoveContainer" containerID="b5a3c97982fe4691690f8cfa50f1d8dc15b8c1d0aea8f1c1c43c4490cf284e5b" Dec 05 23:48:50 crc kubenswrapper[4734]: I1205 23:48:50.860726 4734 scope.go:117] "RemoveContainer" containerID="081b81ceb2eb2e2ee332d925d7eac6f20d10155f1a6a949b002cff09c0a0ded3" Dec 05 23:48:50 crc kubenswrapper[4734]: I1205 23:48:50.887055 4734 scope.go:117] "RemoveContainer" containerID="0e4d94548c85906251bc8fe747697a3172511be772c9842c23bee31dc5c05b93" Dec 05 23:48:50 crc kubenswrapper[4734]: I1205 23:48:50.927632 4734 scope.go:117] 
"RemoveContainer" containerID="a3f3439c1615c6066a0c7b7bb8ed0597b8885146363f6c94f8c1196d20ff839a" Dec 05 23:48:56 crc kubenswrapper[4734]: I1205 23:48:56.614273 4734 scope.go:117] "RemoveContainer" containerID="bf2990588260a60447594f55883e9e43735892e3ca942ebe017df1d6b8641fec" Dec 05 23:48:56 crc kubenswrapper[4734]: E1205 23:48:56.615067 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:49:07 crc kubenswrapper[4734]: I1205 23:49:07.051951 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-f82cb"] Dec 05 23:49:07 crc kubenswrapper[4734]: I1205 23:49:07.063010 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-f82cb"] Dec 05 23:49:07 crc kubenswrapper[4734]: I1205 23:49:07.633131 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d71f9558-c417-4cc7-934f-258f388cced2" path="/var/lib/kubelet/pods/d71f9558-c417-4cc7-934f-258f388cced2/volumes" Dec 05 23:49:08 crc kubenswrapper[4734]: I1205 23:49:08.443975 4734 generic.go:334] "Generic (PLEG): container finished" podID="b881d911-43a8-4290-98e8-89e268e162e4" containerID="00d0831dec8e27e06befc50b1d58d63338a957f9891032fef5dfec7f7ad774b2" exitCode=0 Dec 05 23:49:08 crc kubenswrapper[4734]: I1205 23:49:08.444065 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8d9k" event={"ID":"b881d911-43a8-4290-98e8-89e268e162e4","Type":"ContainerDied","Data":"00d0831dec8e27e06befc50b1d58d63338a957f9891032fef5dfec7f7ad774b2"} Dec 05 23:49:09 crc kubenswrapper[4734]: I1205 23:49:09.633597 4734 
scope.go:117] "RemoveContainer" containerID="bf2990588260a60447594f55883e9e43735892e3ca942ebe017df1d6b8641fec" Dec 05 23:49:09 crc kubenswrapper[4734]: E1205 23:49:09.634165 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:49:09 crc kubenswrapper[4734]: I1205 23:49:09.932218 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8d9k" Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 23:49:10.039342 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-xsvx9"] Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 23:49:10.053019 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-xsvx9"] Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 23:49:10.060828 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm5c9\" (UniqueName: \"kubernetes.io/projected/b881d911-43a8-4290-98e8-89e268e162e4-kube-api-access-cm5c9\") pod \"b881d911-43a8-4290-98e8-89e268e162e4\" (UID: \"b881d911-43a8-4290-98e8-89e268e162e4\") " Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 23:49:10.061077 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b881d911-43a8-4290-98e8-89e268e162e4-inventory\") pod \"b881d911-43a8-4290-98e8-89e268e162e4\" (UID: \"b881d911-43a8-4290-98e8-89e268e162e4\") " Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 23:49:10.061214 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/b881d911-43a8-4290-98e8-89e268e162e4-ssh-key\") pod \"b881d911-43a8-4290-98e8-89e268e162e4\" (UID: \"b881d911-43a8-4290-98e8-89e268e162e4\") " Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 23:49:10.069588 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b881d911-43a8-4290-98e8-89e268e162e4-kube-api-access-cm5c9" (OuterVolumeSpecName: "kube-api-access-cm5c9") pod "b881d911-43a8-4290-98e8-89e268e162e4" (UID: "b881d911-43a8-4290-98e8-89e268e162e4"). InnerVolumeSpecName "kube-api-access-cm5c9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 23:49:10.096830 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b881d911-43a8-4290-98e8-89e268e162e4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b881d911-43a8-4290-98e8-89e268e162e4" (UID: "b881d911-43a8-4290-98e8-89e268e162e4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 23:49:10.097255 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b881d911-43a8-4290-98e8-89e268e162e4-inventory" (OuterVolumeSpecName: "inventory") pod "b881d911-43a8-4290-98e8-89e268e162e4" (UID: "b881d911-43a8-4290-98e8-89e268e162e4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 23:49:10.164512 4734 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b881d911-43a8-4290-98e8-89e268e162e4-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 23:49:10.164584 4734 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b881d911-43a8-4290-98e8-89e268e162e4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 23:49:10.164607 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cm5c9\" (UniqueName: \"kubernetes.io/projected/b881d911-43a8-4290-98e8-89e268e162e4-kube-api-access-cm5c9\") on node \"crc\" DevicePath \"\"" Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 23:49:10.467701 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8d9k" event={"ID":"b881d911-43a8-4290-98e8-89e268e162e4","Type":"ContainerDied","Data":"b4fe3b25146e00c411bf5b7b0d9b26808d2d91501dd72d93b345d4f1c429d6d5"} Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 23:49:10.468025 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4fe3b25146e00c411bf5b7b0d9b26808d2d91501dd72d93b345d4f1c429d6d5" Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 23:49:10.467821 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b8d9k" Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 23:49:10.591175 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xl5qq"] Dec 05 23:49:10 crc kubenswrapper[4734]: E1205 23:49:10.591926 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b881d911-43a8-4290-98e8-89e268e162e4" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 23:49:10.591955 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="b881d911-43a8-4290-98e8-89e268e162e4" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 23:49:10.592172 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="b881d911-43a8-4290-98e8-89e268e162e4" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 23:49:10.593033 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xl5qq" Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 23:49:10.598086 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gsdqx" Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 23:49:10.599081 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 23:49:10.599284 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 23:49:10.599464 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 23:49:10.602157 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xl5qq"] Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 23:49:10.676751 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f183bc38-e046-45f6-b96a-440e596c8088-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xl5qq\" (UID: \"f183bc38-e046-45f6-b96a-440e596c8088\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xl5qq" Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 23:49:10.677312 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvnff\" (UniqueName: \"kubernetes.io/projected/f183bc38-e046-45f6-b96a-440e596c8088-kube-api-access-zvnff\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xl5qq\" (UID: \"f183bc38-e046-45f6-b96a-440e596c8088\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xl5qq" Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 
23:49:10.677852 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f183bc38-e046-45f6-b96a-440e596c8088-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xl5qq\" (UID: \"f183bc38-e046-45f6-b96a-440e596c8088\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xl5qq" Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 23:49:10.779766 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f183bc38-e046-45f6-b96a-440e596c8088-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xl5qq\" (UID: \"f183bc38-e046-45f6-b96a-440e596c8088\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xl5qq" Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 23:49:10.779922 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f183bc38-e046-45f6-b96a-440e596c8088-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xl5qq\" (UID: \"f183bc38-e046-45f6-b96a-440e596c8088\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xl5qq" Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 23:49:10.781342 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvnff\" (UniqueName: \"kubernetes.io/projected/f183bc38-e046-45f6-b96a-440e596c8088-kube-api-access-zvnff\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xl5qq\" (UID: \"f183bc38-e046-45f6-b96a-440e596c8088\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xl5qq" Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 23:49:10.785563 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f183bc38-e046-45f6-b96a-440e596c8088-ssh-key\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-xl5qq\" (UID: \"f183bc38-e046-45f6-b96a-440e596c8088\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xl5qq" Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 23:49:10.793210 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f183bc38-e046-45f6-b96a-440e596c8088-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xl5qq\" (UID: \"f183bc38-e046-45f6-b96a-440e596c8088\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xl5qq" Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 23:49:10.809428 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvnff\" (UniqueName: \"kubernetes.io/projected/f183bc38-e046-45f6-b96a-440e596c8088-kube-api-access-zvnff\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xl5qq\" (UID: \"f183bc38-e046-45f6-b96a-440e596c8088\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xl5qq" Dec 05 23:49:10 crc kubenswrapper[4734]: I1205 23:49:10.913996 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xl5qq" Dec 05 23:49:11 crc kubenswrapper[4734]: I1205 23:49:11.632890 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c26d17f-e341-41c5-9759-c0b265fcceea" path="/var/lib/kubelet/pods/4c26d17f-e341-41c5-9759-c0b265fcceea/volumes" Dec 05 23:49:11 crc kubenswrapper[4734]: I1205 23:49:11.634291 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xl5qq"] Dec 05 23:49:12 crc kubenswrapper[4734]: I1205 23:49:12.492076 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xl5qq" event={"ID":"f183bc38-e046-45f6-b96a-440e596c8088","Type":"ContainerStarted","Data":"99d2861072f4670dd9e5ff03801d70fe68edb07671be0a95677f2a2a5f5d36e7"} Dec 05 23:49:12 crc kubenswrapper[4734]: I1205 23:49:12.492874 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xl5qq" event={"ID":"f183bc38-e046-45f6-b96a-440e596c8088","Type":"ContainerStarted","Data":"bd5194a69803ad52ad45e7919f82a140541046f17e2d3e52847852bf27876a66"} Dec 05 23:49:12 crc kubenswrapper[4734]: I1205 23:49:12.517745 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xl5qq" podStartSLOduration=2.015337548 podStartE2EDuration="2.51772348s" podCreationTimestamp="2025-12-05 23:49:10 +0000 UTC" firstStartedPulling="2025-12-05 23:49:11.612835188 +0000 UTC m=+1772.296239464" lastFinishedPulling="2025-12-05 23:49:12.11522112 +0000 UTC m=+1772.798625396" observedRunningTime="2025-12-05 23:49:12.51358861 +0000 UTC m=+1773.196992886" watchObservedRunningTime="2025-12-05 23:49:12.51772348 +0000 UTC m=+1773.201127766" Dec 05 23:49:24 crc kubenswrapper[4734]: I1205 23:49:24.614517 4734 scope.go:117] "RemoveContainer" 
containerID="bf2990588260a60447594f55883e9e43735892e3ca942ebe017df1d6b8641fec" Dec 05 23:49:24 crc kubenswrapper[4734]: E1205 23:49:24.615655 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:49:38 crc kubenswrapper[4734]: I1205 23:49:38.614417 4734 scope.go:117] "RemoveContainer" containerID="bf2990588260a60447594f55883e9e43735892e3ca942ebe017df1d6b8641fec" Dec 05 23:49:38 crc kubenswrapper[4734]: E1205 23:49:38.617473 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:49:51 crc kubenswrapper[4734]: I1205 23:49:51.178491 4734 scope.go:117] "RemoveContainer" containerID="ad3d670f413f88dcf8202b5fcb6c9d218e25ec1a2ba6fa4ebad056845b56179f" Dec 05 23:49:51 crc kubenswrapper[4734]: I1205 23:49:51.233161 4734 scope.go:117] "RemoveContainer" containerID="1b18a6a4b18d08501789d9e21b925316c413cdbe7a4ed004b5fef4a11dfd69d1" Dec 05 23:49:52 crc kubenswrapper[4734]: I1205 23:49:52.614541 4734 scope.go:117] "RemoveContainer" containerID="bf2990588260a60447594f55883e9e43735892e3ca942ebe017df1d6b8641fec" Dec 05 23:49:52 crc kubenswrapper[4734]: E1205 23:49:52.615412 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:50:06 crc kubenswrapper[4734]: I1205 23:50:06.614457 4734 scope.go:117] "RemoveContainer" containerID="bf2990588260a60447594f55883e9e43735892e3ca942ebe017df1d6b8641fec" Dec 05 23:50:06 crc kubenswrapper[4734]: E1205 23:50:06.615487 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:50:12 crc kubenswrapper[4734]: I1205 23:50:12.060488 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-4fmf5"] Dec 05 23:50:12 crc kubenswrapper[4734]: I1205 23:50:12.073336 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-jdhd2"] Dec 05 23:50:12 crc kubenswrapper[4734]: I1205 23:50:12.082037 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-4fmf5"] Dec 05 23:50:12 crc kubenswrapper[4734]: I1205 23:50:12.089929 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-jdhd2"] Dec 05 23:50:13 crc kubenswrapper[4734]: I1205 23:50:13.034017 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-d01a-account-create-update-4smb4"] Dec 05 23:50:13 crc kubenswrapper[4734]: I1205 23:50:13.045712 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-d01a-account-create-update-4smb4"] Dec 05 23:50:13 crc kubenswrapper[4734]: I1205 
23:50:13.625998 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52e79363-534e-4d0c-9cdf-86ad75fa19bb" path="/var/lib/kubelet/pods/52e79363-534e-4d0c-9cdf-86ad75fa19bb/volumes" Dec 05 23:50:13 crc kubenswrapper[4734]: I1205 23:50:13.626709 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b88ba6a0-1e12-4bd6-a483-b2522fad58f9" path="/var/lib/kubelet/pods/b88ba6a0-1e12-4bd6-a483-b2522fad58f9/volumes" Dec 05 23:50:13 crc kubenswrapper[4734]: I1205 23:50:13.627432 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0" path="/var/lib/kubelet/pods/e9ac3e25-8dd3-45d6-9899-66ad5fb9f9d0/volumes" Dec 05 23:50:14 crc kubenswrapper[4734]: I1205 23:50:14.047578 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-3f13-account-create-update-p4zg7"] Dec 05 23:50:14 crc kubenswrapper[4734]: I1205 23:50:14.061192 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-53ed-account-create-update-8tnq8"] Dec 05 23:50:14 crc kubenswrapper[4734]: I1205 23:50:14.072225 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-qrbf6"] Dec 05 23:50:14 crc kubenswrapper[4734]: I1205 23:50:14.082577 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-3f13-account-create-update-p4zg7"] Dec 05 23:50:14 crc kubenswrapper[4734]: I1205 23:50:14.099824 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-qrbf6"] Dec 05 23:50:14 crc kubenswrapper[4734]: I1205 23:50:14.110949 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-53ed-account-create-update-8tnq8"] Dec 05 23:50:15 crc kubenswrapper[4734]: I1205 23:50:15.626770 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1148161b-fe59-434d-880a-80a03b0c8ff7" path="/var/lib/kubelet/pods/1148161b-fe59-434d-880a-80a03b0c8ff7/volumes" Dec 05 
23:50:15 crc kubenswrapper[4734]: I1205 23:50:15.627851 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2bc9278-e755-43b5-8fb5-91854e437360" path="/var/lib/kubelet/pods/b2bc9278-e755-43b5-8fb5-91854e437360/volumes" Dec 05 23:50:15 crc kubenswrapper[4734]: I1205 23:50:15.628428 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caa5035d-868a-4c3a-bb3e-43f7b84096f4" path="/var/lib/kubelet/pods/caa5035d-868a-4c3a-bb3e-43f7b84096f4/volumes" Dec 05 23:50:20 crc kubenswrapper[4734]: I1205 23:50:20.614224 4734 scope.go:117] "RemoveContainer" containerID="bf2990588260a60447594f55883e9e43735892e3ca942ebe017df1d6b8641fec" Dec 05 23:50:20 crc kubenswrapper[4734]: E1205 23:50:20.615072 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:50:33 crc kubenswrapper[4734]: I1205 23:50:33.383744 4734 generic.go:334] "Generic (PLEG): container finished" podID="f183bc38-e046-45f6-b96a-440e596c8088" containerID="99d2861072f4670dd9e5ff03801d70fe68edb07671be0a95677f2a2a5f5d36e7" exitCode=0 Dec 05 23:50:33 crc kubenswrapper[4734]: I1205 23:50:33.383832 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xl5qq" event={"ID":"f183bc38-e046-45f6-b96a-440e596c8088","Type":"ContainerDied","Data":"99d2861072f4670dd9e5ff03801d70fe68edb07671be0a95677f2a2a5f5d36e7"} Dec 05 23:50:34 crc kubenswrapper[4734]: I1205 23:50:34.848950 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xl5qq" Dec 05 23:50:35 crc kubenswrapper[4734]: I1205 23:50:35.041759 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f183bc38-e046-45f6-b96a-440e596c8088-ssh-key\") pod \"f183bc38-e046-45f6-b96a-440e596c8088\" (UID: \"f183bc38-e046-45f6-b96a-440e596c8088\") " Dec 05 23:50:35 crc kubenswrapper[4734]: I1205 23:50:35.041832 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f183bc38-e046-45f6-b96a-440e596c8088-inventory\") pod \"f183bc38-e046-45f6-b96a-440e596c8088\" (UID: \"f183bc38-e046-45f6-b96a-440e596c8088\") " Dec 05 23:50:35 crc kubenswrapper[4734]: I1205 23:50:35.041877 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvnff\" (UniqueName: \"kubernetes.io/projected/f183bc38-e046-45f6-b96a-440e596c8088-kube-api-access-zvnff\") pod \"f183bc38-e046-45f6-b96a-440e596c8088\" (UID: \"f183bc38-e046-45f6-b96a-440e596c8088\") " Dec 05 23:50:35 crc kubenswrapper[4734]: I1205 23:50:35.051307 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f183bc38-e046-45f6-b96a-440e596c8088-kube-api-access-zvnff" (OuterVolumeSpecName: "kube-api-access-zvnff") pod "f183bc38-e046-45f6-b96a-440e596c8088" (UID: "f183bc38-e046-45f6-b96a-440e596c8088"). InnerVolumeSpecName "kube-api-access-zvnff". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:50:35 crc kubenswrapper[4734]: I1205 23:50:35.077107 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f183bc38-e046-45f6-b96a-440e596c8088-inventory" (OuterVolumeSpecName: "inventory") pod "f183bc38-e046-45f6-b96a-440e596c8088" (UID: "f183bc38-e046-45f6-b96a-440e596c8088"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:50:35 crc kubenswrapper[4734]: I1205 23:50:35.077768 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f183bc38-e046-45f6-b96a-440e596c8088-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f183bc38-e046-45f6-b96a-440e596c8088" (UID: "f183bc38-e046-45f6-b96a-440e596c8088"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:50:35 crc kubenswrapper[4734]: I1205 23:50:35.145432 4734 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f183bc38-e046-45f6-b96a-440e596c8088-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 23:50:35 crc kubenswrapper[4734]: I1205 23:50:35.145499 4734 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f183bc38-e046-45f6-b96a-440e596c8088-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 23:50:35 crc kubenswrapper[4734]: I1205 23:50:35.145513 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvnff\" (UniqueName: \"kubernetes.io/projected/f183bc38-e046-45f6-b96a-440e596c8088-kube-api-access-zvnff\") on node \"crc\" DevicePath \"\"" Dec 05 23:50:35 crc kubenswrapper[4734]: I1205 23:50:35.408207 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xl5qq" event={"ID":"f183bc38-e046-45f6-b96a-440e596c8088","Type":"ContainerDied","Data":"bd5194a69803ad52ad45e7919f82a140541046f17e2d3e52847852bf27876a66"} Dec 05 23:50:35 crc kubenswrapper[4734]: I1205 23:50:35.408273 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd5194a69803ad52ad45e7919f82a140541046f17e2d3e52847852bf27876a66" Dec 05 23:50:35 crc kubenswrapper[4734]: I1205 23:50:35.408429 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xl5qq" Dec 05 23:50:35 crc kubenswrapper[4734]: I1205 23:50:35.524619 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45qzx"] Dec 05 23:50:35 crc kubenswrapper[4734]: E1205 23:50:35.525147 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f183bc38-e046-45f6-b96a-440e596c8088" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 05 23:50:35 crc kubenswrapper[4734]: I1205 23:50:35.525180 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="f183bc38-e046-45f6-b96a-440e596c8088" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 05 23:50:35 crc kubenswrapper[4734]: I1205 23:50:35.525415 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="f183bc38-e046-45f6-b96a-440e596c8088" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 05 23:50:35 crc kubenswrapper[4734]: I1205 23:50:35.526321 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45qzx" Dec 05 23:50:35 crc kubenswrapper[4734]: I1205 23:50:35.529656 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 23:50:35 crc kubenswrapper[4734]: I1205 23:50:35.531320 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 23:50:35 crc kubenswrapper[4734]: I1205 23:50:35.531726 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 23:50:35 crc kubenswrapper[4734]: I1205 23:50:35.541029 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gsdqx" Dec 05 23:50:35 crc kubenswrapper[4734]: I1205 23:50:35.543097 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45qzx"] Dec 05 23:50:35 crc kubenswrapper[4734]: I1205 23:50:35.583330 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc15ec12-e046-4933-beec-886e0868c644-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-45qzx\" (UID: \"cc15ec12-e046-4933-beec-886e0868c644\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45qzx" Dec 05 23:50:35 crc kubenswrapper[4734]: I1205 23:50:35.583442 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc15ec12-e046-4933-beec-886e0868c644-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-45qzx\" (UID: \"cc15ec12-e046-4933-beec-886e0868c644\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45qzx" Dec 05 23:50:35 crc kubenswrapper[4734]: I1205 23:50:35.583621 4734 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5kkk\" (UniqueName: \"kubernetes.io/projected/cc15ec12-e046-4933-beec-886e0868c644-kube-api-access-z5kkk\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-45qzx\" (UID: \"cc15ec12-e046-4933-beec-886e0868c644\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45qzx" Dec 05 23:50:35 crc kubenswrapper[4734]: I1205 23:50:35.614625 4734 scope.go:117] "RemoveContainer" containerID="bf2990588260a60447594f55883e9e43735892e3ca942ebe017df1d6b8641fec" Dec 05 23:50:35 crc kubenswrapper[4734]: E1205 23:50:35.615046 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:50:35 crc kubenswrapper[4734]: I1205 23:50:35.685587 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5kkk\" (UniqueName: \"kubernetes.io/projected/cc15ec12-e046-4933-beec-886e0868c644-kube-api-access-z5kkk\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-45qzx\" (UID: \"cc15ec12-e046-4933-beec-886e0868c644\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45qzx" Dec 05 23:50:35 crc kubenswrapper[4734]: I1205 23:50:35.685829 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc15ec12-e046-4933-beec-886e0868c644-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-45qzx\" (UID: \"cc15ec12-e046-4933-beec-886e0868c644\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45qzx" Dec 05 
23:50:35 crc kubenswrapper[4734]: I1205 23:50:35.685929 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc15ec12-e046-4933-beec-886e0868c644-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-45qzx\" (UID: \"cc15ec12-e046-4933-beec-886e0868c644\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45qzx" Dec 05 23:50:35 crc kubenswrapper[4734]: I1205 23:50:35.695640 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc15ec12-e046-4933-beec-886e0868c644-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-45qzx\" (UID: \"cc15ec12-e046-4933-beec-886e0868c644\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45qzx" Dec 05 23:50:35 crc kubenswrapper[4734]: I1205 23:50:35.704620 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc15ec12-e046-4933-beec-886e0868c644-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-45qzx\" (UID: \"cc15ec12-e046-4933-beec-886e0868c644\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45qzx" Dec 05 23:50:35 crc kubenswrapper[4734]: I1205 23:50:35.705086 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5kkk\" (UniqueName: \"kubernetes.io/projected/cc15ec12-e046-4933-beec-886e0868c644-kube-api-access-z5kkk\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-45qzx\" (UID: \"cc15ec12-e046-4933-beec-886e0868c644\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45qzx" Dec 05 23:50:35 crc kubenswrapper[4734]: I1205 23:50:35.889927 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45qzx" Dec 05 23:50:36 crc kubenswrapper[4734]: I1205 23:50:36.449371 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45qzx"] Dec 05 23:50:37 crc kubenswrapper[4734]: I1205 23:50:37.432108 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45qzx" event={"ID":"cc15ec12-e046-4933-beec-886e0868c644","Type":"ContainerStarted","Data":"4275dd8e45399efa48c8bf76853926240cce1dba1424836e8f51fce19da72257"} Dec 05 23:50:37 crc kubenswrapper[4734]: I1205 23:50:37.432946 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45qzx" event={"ID":"cc15ec12-e046-4933-beec-886e0868c644","Type":"ContainerStarted","Data":"1d84741f64a2316ba864fd81065a4fe74f0087916f10236b9017ceee4b8de440"} Dec 05 23:50:37 crc kubenswrapper[4734]: I1205 23:50:37.471613 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45qzx" podStartSLOduration=2.046977179 podStartE2EDuration="2.471583789s" podCreationTimestamp="2025-12-05 23:50:35 +0000 UTC" firstStartedPulling="2025-12-05 23:50:36.445854453 +0000 UTC m=+1857.129258769" lastFinishedPulling="2025-12-05 23:50:36.870461103 +0000 UTC m=+1857.553865379" observedRunningTime="2025-12-05 23:50:37.464581748 +0000 UTC m=+1858.147986024" watchObservedRunningTime="2025-12-05 23:50:37.471583789 +0000 UTC m=+1858.154988065" Dec 05 23:50:43 crc kubenswrapper[4734]: I1205 23:50:43.500823 4734 generic.go:334] "Generic (PLEG): container finished" podID="cc15ec12-e046-4933-beec-886e0868c644" containerID="4275dd8e45399efa48c8bf76853926240cce1dba1424836e8f51fce19da72257" exitCode=0 Dec 05 23:50:43 crc kubenswrapper[4734]: I1205 23:50:43.500916 4734 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45qzx" event={"ID":"cc15ec12-e046-4933-beec-886e0868c644","Type":"ContainerDied","Data":"4275dd8e45399efa48c8bf76853926240cce1dba1424836e8f51fce19da72257"} Dec 05 23:50:44 crc kubenswrapper[4734]: I1205 23:50:44.958959 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45qzx" Dec 05 23:50:45 crc kubenswrapper[4734]: I1205 23:50:45.033401 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5kkk\" (UniqueName: \"kubernetes.io/projected/cc15ec12-e046-4933-beec-886e0868c644-kube-api-access-z5kkk\") pod \"cc15ec12-e046-4933-beec-886e0868c644\" (UID: \"cc15ec12-e046-4933-beec-886e0868c644\") " Dec 05 23:50:45 crc kubenswrapper[4734]: I1205 23:50:45.033695 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc15ec12-e046-4933-beec-886e0868c644-inventory\") pod \"cc15ec12-e046-4933-beec-886e0868c644\" (UID: \"cc15ec12-e046-4933-beec-886e0868c644\") " Dec 05 23:50:45 crc kubenswrapper[4734]: I1205 23:50:45.033734 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc15ec12-e046-4933-beec-886e0868c644-ssh-key\") pod \"cc15ec12-e046-4933-beec-886e0868c644\" (UID: \"cc15ec12-e046-4933-beec-886e0868c644\") " Dec 05 23:50:45 crc kubenswrapper[4734]: I1205 23:50:45.040246 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc15ec12-e046-4933-beec-886e0868c644-kube-api-access-z5kkk" (OuterVolumeSpecName: "kube-api-access-z5kkk") pod "cc15ec12-e046-4933-beec-886e0868c644" (UID: "cc15ec12-e046-4933-beec-886e0868c644"). InnerVolumeSpecName "kube-api-access-z5kkk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:50:45 crc kubenswrapper[4734]: I1205 23:50:45.064551 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc15ec12-e046-4933-beec-886e0868c644-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cc15ec12-e046-4933-beec-886e0868c644" (UID: "cc15ec12-e046-4933-beec-886e0868c644"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:50:45 crc kubenswrapper[4734]: I1205 23:50:45.065560 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc15ec12-e046-4933-beec-886e0868c644-inventory" (OuterVolumeSpecName: "inventory") pod "cc15ec12-e046-4933-beec-886e0868c644" (UID: "cc15ec12-e046-4933-beec-886e0868c644"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:50:45 crc kubenswrapper[4734]: I1205 23:50:45.138937 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5kkk\" (UniqueName: \"kubernetes.io/projected/cc15ec12-e046-4933-beec-886e0868c644-kube-api-access-z5kkk\") on node \"crc\" DevicePath \"\"" Dec 05 23:50:45 crc kubenswrapper[4734]: I1205 23:50:45.138996 4734 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc15ec12-e046-4933-beec-886e0868c644-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 23:50:45 crc kubenswrapper[4734]: I1205 23:50:45.139006 4734 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cc15ec12-e046-4933-beec-886e0868c644-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 23:50:45 crc kubenswrapper[4734]: I1205 23:50:45.525152 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45qzx" 
event={"ID":"cc15ec12-e046-4933-beec-886e0868c644","Type":"ContainerDied","Data":"1d84741f64a2316ba864fd81065a4fe74f0087916f10236b9017ceee4b8de440"} Dec 05 23:50:45 crc kubenswrapper[4734]: I1205 23:50:45.525601 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d84741f64a2316ba864fd81065a4fe74f0087916f10236b9017ceee4b8de440" Dec 05 23:50:45 crc kubenswrapper[4734]: I1205 23:50:45.525255 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45qzx" Dec 05 23:50:45 crc kubenswrapper[4734]: I1205 23:50:45.613247 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-lqrnf"] Dec 05 23:50:45 crc kubenswrapper[4734]: E1205 23:50:45.613842 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc15ec12-e046-4933-beec-886e0868c644" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 05 23:50:45 crc kubenswrapper[4734]: I1205 23:50:45.613871 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc15ec12-e046-4933-beec-886e0868c644" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 05 23:50:45 crc kubenswrapper[4734]: I1205 23:50:45.614170 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc15ec12-e046-4933-beec-886e0868c644" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 05 23:50:45 crc kubenswrapper[4734]: I1205 23:50:45.616999 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lqrnf" Dec 05 23:50:45 crc kubenswrapper[4734]: I1205 23:50:45.630851 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 23:50:45 crc kubenswrapper[4734]: I1205 23:50:45.630989 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 23:50:45 crc kubenswrapper[4734]: I1205 23:50:45.632903 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 23:50:45 crc kubenswrapper[4734]: I1205 23:50:45.635158 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gsdqx" Dec 05 23:50:45 crc kubenswrapper[4734]: I1205 23:50:45.653413 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpxdh\" (UniqueName: \"kubernetes.io/projected/43caeb9a-1d22-41be-abb1-48b4881e6afb-kube-api-access-kpxdh\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lqrnf\" (UID: \"43caeb9a-1d22-41be-abb1-48b4881e6afb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lqrnf" Dec 05 23:50:45 crc kubenswrapper[4734]: I1205 23:50:45.653484 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43caeb9a-1d22-41be-abb1-48b4881e6afb-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lqrnf\" (UID: \"43caeb9a-1d22-41be-abb1-48b4881e6afb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lqrnf" Dec 05 23:50:45 crc kubenswrapper[4734]: I1205 23:50:45.653584 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43caeb9a-1d22-41be-abb1-48b4881e6afb-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-lqrnf\" (UID: \"43caeb9a-1d22-41be-abb1-48b4881e6afb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lqrnf" Dec 05 23:50:45 crc kubenswrapper[4734]: I1205 23:50:45.674068 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-lqrnf"] Dec 05 23:50:45 crc kubenswrapper[4734]: I1205 23:50:45.756301 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpxdh\" (UniqueName: \"kubernetes.io/projected/43caeb9a-1d22-41be-abb1-48b4881e6afb-kube-api-access-kpxdh\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lqrnf\" (UID: \"43caeb9a-1d22-41be-abb1-48b4881e6afb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lqrnf" Dec 05 23:50:45 crc kubenswrapper[4734]: I1205 23:50:45.756384 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43caeb9a-1d22-41be-abb1-48b4881e6afb-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lqrnf\" (UID: \"43caeb9a-1d22-41be-abb1-48b4881e6afb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lqrnf" Dec 05 23:50:45 crc kubenswrapper[4734]: I1205 23:50:45.756457 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43caeb9a-1d22-41be-abb1-48b4881e6afb-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lqrnf\" (UID: \"43caeb9a-1d22-41be-abb1-48b4881e6afb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lqrnf" Dec 05 23:50:45 crc kubenswrapper[4734]: I1205 23:50:45.762691 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43caeb9a-1d22-41be-abb1-48b4881e6afb-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lqrnf\" (UID: \"43caeb9a-1d22-41be-abb1-48b4881e6afb\") 
" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lqrnf" Dec 05 23:50:45 crc kubenswrapper[4734]: I1205 23:50:45.762759 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43caeb9a-1d22-41be-abb1-48b4881e6afb-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lqrnf\" (UID: \"43caeb9a-1d22-41be-abb1-48b4881e6afb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lqrnf" Dec 05 23:50:45 crc kubenswrapper[4734]: I1205 23:50:45.783338 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpxdh\" (UniqueName: \"kubernetes.io/projected/43caeb9a-1d22-41be-abb1-48b4881e6afb-kube-api-access-kpxdh\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lqrnf\" (UID: \"43caeb9a-1d22-41be-abb1-48b4881e6afb\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lqrnf" Dec 05 23:50:45 crc kubenswrapper[4734]: I1205 23:50:45.955511 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lqrnf" Dec 05 23:50:46 crc kubenswrapper[4734]: I1205 23:50:46.343060 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-lqrnf"] Dec 05 23:50:46 crc kubenswrapper[4734]: I1205 23:50:46.535021 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lqrnf" event={"ID":"43caeb9a-1d22-41be-abb1-48b4881e6afb","Type":"ContainerStarted","Data":"7e22a9219ab27dfb957cb25bf8d197e2896a2df0fa2e057c2284b839d76aa58f"} Dec 05 23:50:47 crc kubenswrapper[4734]: I1205 23:50:47.045136 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jcths"] Dec 05 23:50:47 crc kubenswrapper[4734]: I1205 23:50:47.058151 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jcths"] Dec 05 23:50:47 crc kubenswrapper[4734]: I1205 23:50:47.549241 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lqrnf" event={"ID":"43caeb9a-1d22-41be-abb1-48b4881e6afb","Type":"ContainerStarted","Data":"62e98402efac70169a731549f248565e635ee70a3f04aea321ca7404f7bf1099"} Dec 05 23:50:47 crc kubenswrapper[4734]: I1205 23:50:47.592248 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lqrnf" podStartSLOduration=2.135220383 podStartE2EDuration="2.59221953s" podCreationTimestamp="2025-12-05 23:50:45 +0000 UTC" firstStartedPulling="2025-12-05 23:50:46.352400343 +0000 UTC m=+1867.035804619" lastFinishedPulling="2025-12-05 23:50:46.8093995 +0000 UTC m=+1867.492803766" observedRunningTime="2025-12-05 23:50:47.581851538 +0000 UTC m=+1868.265255814" watchObservedRunningTime="2025-12-05 23:50:47.59221953 +0000 UTC m=+1868.275623796" Dec 05 23:50:47 crc kubenswrapper[4734]: I1205 
23:50:47.628098 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="017da56d-32a5-42b2-91c5-efc5fc6480c3" path="/var/lib/kubelet/pods/017da56d-32a5-42b2-91c5-efc5fc6480c3/volumes" Dec 05 23:50:50 crc kubenswrapper[4734]: I1205 23:50:50.614459 4734 scope.go:117] "RemoveContainer" containerID="bf2990588260a60447594f55883e9e43735892e3ca942ebe017df1d6b8641fec" Dec 05 23:50:50 crc kubenswrapper[4734]: E1205 23:50:50.615209 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:50:51 crc kubenswrapper[4734]: I1205 23:50:51.354598 4734 scope.go:117] "RemoveContainer" containerID="280dcb46d2d5e0e0ad693ab9fa11a2b80bd6d54132b3847713805478cdafa687" Dec 05 23:50:51 crc kubenswrapper[4734]: I1205 23:50:51.383680 4734 scope.go:117] "RemoveContainer" containerID="285fa850414a64aee6539c2bf6a5ce85580800eb5ed4b04238c6235b47f95167" Dec 05 23:50:51 crc kubenswrapper[4734]: I1205 23:50:51.452289 4734 scope.go:117] "RemoveContainer" containerID="6ef81526408489648dfd43063f64a53e2dda7f0bef3c2b9ccc2203d03038925f" Dec 05 23:50:51 crc kubenswrapper[4734]: I1205 23:50:51.504095 4734 scope.go:117] "RemoveContainer" containerID="ff2091829a05a30ccb88eb7084e7999fd774c9be43e57361cdcf75d73a9f7e1f" Dec 05 23:50:51 crc kubenswrapper[4734]: I1205 23:50:51.553746 4734 scope.go:117] "RemoveContainer" containerID="be080adf0c7e472e9d3f2c20e97eab322e3127ee9529e220c989ef5ee124359d" Dec 05 23:50:51 crc kubenswrapper[4734]: I1205 23:50:51.652565 4734 scope.go:117] "RemoveContainer" containerID="125cdc92a0a6a72d59baae612885bbd611cd97fad90627d46778a3cec2076ab4" Dec 05 23:50:51 crc kubenswrapper[4734]: I1205 
23:50:51.676488 4734 scope.go:117] "RemoveContainer" containerID="290b0639851f62b73a9b551489b2e7b6df3784c8b0c0f231090f1cf4238d4feb" Dec 05 23:51:04 crc kubenswrapper[4734]: I1205 23:51:04.615447 4734 scope.go:117] "RemoveContainer" containerID="bf2990588260a60447594f55883e9e43735892e3ca942ebe017df1d6b8641fec" Dec 05 23:51:04 crc kubenswrapper[4734]: E1205 23:51:04.616629 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:51:12 crc kubenswrapper[4734]: I1205 23:51:12.044482 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tr7x7"] Dec 05 23:51:12 crc kubenswrapper[4734]: I1205 23:51:12.057060 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tr7x7"] Dec 05 23:51:13 crc kubenswrapper[4734]: I1205 23:51:13.033376 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-mbpqk"] Dec 05 23:51:13 crc kubenswrapper[4734]: I1205 23:51:13.044489 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-mbpqk"] Dec 05 23:51:13 crc kubenswrapper[4734]: I1205 23:51:13.627640 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3111319c-91ad-46ab-847b-5f08b2d01cb5" path="/var/lib/kubelet/pods/3111319c-91ad-46ab-847b-5f08b2d01cb5/volumes" Dec 05 23:51:13 crc kubenswrapper[4734]: I1205 23:51:13.628701 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a44e1e9c-243f-4967-ac93-72db0dd02eb0" path="/var/lib/kubelet/pods/a44e1e9c-243f-4967-ac93-72db0dd02eb0/volumes" Dec 05 23:51:19 crc 
kubenswrapper[4734]: I1205 23:51:19.622702 4734 scope.go:117] "RemoveContainer" containerID="bf2990588260a60447594f55883e9e43735892e3ca942ebe017df1d6b8641fec" Dec 05 23:51:19 crc kubenswrapper[4734]: E1205 23:51:19.623918 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:51:30 crc kubenswrapper[4734]: I1205 23:51:30.614631 4734 scope.go:117] "RemoveContainer" containerID="bf2990588260a60447594f55883e9e43735892e3ca942ebe017df1d6b8641fec" Dec 05 23:51:31 crc kubenswrapper[4734]: I1205 23:51:31.062849 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerStarted","Data":"62ea0ca90a403cb22c1108463dd79f495ead53f2907ab42b91e5688249314f62"} Dec 05 23:51:32 crc kubenswrapper[4734]: I1205 23:51:32.074858 4734 generic.go:334] "Generic (PLEG): container finished" podID="43caeb9a-1d22-41be-abb1-48b4881e6afb" containerID="62e98402efac70169a731549f248565e635ee70a3f04aea321ca7404f7bf1099" exitCode=0 Dec 05 23:51:32 crc kubenswrapper[4734]: I1205 23:51:32.074991 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lqrnf" event={"ID":"43caeb9a-1d22-41be-abb1-48b4881e6afb","Type":"ContainerDied","Data":"62e98402efac70169a731549f248565e635ee70a3f04aea321ca7404f7bf1099"} Dec 05 23:51:33 crc kubenswrapper[4734]: I1205 23:51:33.540646 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lqrnf" Dec 05 23:51:33 crc kubenswrapper[4734]: I1205 23:51:33.703801 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43caeb9a-1d22-41be-abb1-48b4881e6afb-ssh-key\") pod \"43caeb9a-1d22-41be-abb1-48b4881e6afb\" (UID: \"43caeb9a-1d22-41be-abb1-48b4881e6afb\") " Dec 05 23:51:33 crc kubenswrapper[4734]: I1205 23:51:33.703880 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpxdh\" (UniqueName: \"kubernetes.io/projected/43caeb9a-1d22-41be-abb1-48b4881e6afb-kube-api-access-kpxdh\") pod \"43caeb9a-1d22-41be-abb1-48b4881e6afb\" (UID: \"43caeb9a-1d22-41be-abb1-48b4881e6afb\") " Dec 05 23:51:33 crc kubenswrapper[4734]: I1205 23:51:33.704153 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43caeb9a-1d22-41be-abb1-48b4881e6afb-inventory\") pod \"43caeb9a-1d22-41be-abb1-48b4881e6afb\" (UID: \"43caeb9a-1d22-41be-abb1-48b4881e6afb\") " Dec 05 23:51:33 crc kubenswrapper[4734]: I1205 23:51:33.720031 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43caeb9a-1d22-41be-abb1-48b4881e6afb-kube-api-access-kpxdh" (OuterVolumeSpecName: "kube-api-access-kpxdh") pod "43caeb9a-1d22-41be-abb1-48b4881e6afb" (UID: "43caeb9a-1d22-41be-abb1-48b4881e6afb"). InnerVolumeSpecName "kube-api-access-kpxdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:51:33 crc kubenswrapper[4734]: I1205 23:51:33.738957 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43caeb9a-1d22-41be-abb1-48b4881e6afb-inventory" (OuterVolumeSpecName: "inventory") pod "43caeb9a-1d22-41be-abb1-48b4881e6afb" (UID: "43caeb9a-1d22-41be-abb1-48b4881e6afb"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:51:33 crc kubenswrapper[4734]: I1205 23:51:33.741553 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43caeb9a-1d22-41be-abb1-48b4881e6afb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "43caeb9a-1d22-41be-abb1-48b4881e6afb" (UID: "43caeb9a-1d22-41be-abb1-48b4881e6afb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:51:33 crc kubenswrapper[4734]: I1205 23:51:33.807494 4734 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43caeb9a-1d22-41be-abb1-48b4881e6afb-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 23:51:33 crc kubenswrapper[4734]: I1205 23:51:33.808101 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpxdh\" (UniqueName: \"kubernetes.io/projected/43caeb9a-1d22-41be-abb1-48b4881e6afb-kube-api-access-kpxdh\") on node \"crc\" DevicePath \"\"" Dec 05 23:51:33 crc kubenswrapper[4734]: I1205 23:51:33.808435 4734 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43caeb9a-1d22-41be-abb1-48b4881e6afb-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 23:51:34 crc kubenswrapper[4734]: I1205 23:51:34.097784 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lqrnf" event={"ID":"43caeb9a-1d22-41be-abb1-48b4881e6afb","Type":"ContainerDied","Data":"7e22a9219ab27dfb957cb25bf8d197e2896a2df0fa2e057c2284b839d76aa58f"} Dec 05 23:51:34 crc kubenswrapper[4734]: I1205 23:51:34.098429 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e22a9219ab27dfb957cb25bf8d197e2896a2df0fa2e057c2284b839d76aa58f" Dec 05 23:51:34 crc kubenswrapper[4734]: I1205 23:51:34.098000 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lqrnf" Dec 05 23:51:34 crc kubenswrapper[4734]: I1205 23:51:34.203748 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hwrl9"] Dec 05 23:51:34 crc kubenswrapper[4734]: E1205 23:51:34.204781 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43caeb9a-1d22-41be-abb1-48b4881e6afb" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 05 23:51:34 crc kubenswrapper[4734]: I1205 23:51:34.204882 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="43caeb9a-1d22-41be-abb1-48b4881e6afb" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 05 23:51:34 crc kubenswrapper[4734]: I1205 23:51:34.205251 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="43caeb9a-1d22-41be-abb1-48b4881e6afb" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 05 23:51:34 crc kubenswrapper[4734]: I1205 23:51:34.206362 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hwrl9" Dec 05 23:51:34 crc kubenswrapper[4734]: I1205 23:51:34.211055 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gsdqx" Dec 05 23:51:34 crc kubenswrapper[4734]: I1205 23:51:34.211360 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 23:51:34 crc kubenswrapper[4734]: I1205 23:51:34.211674 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 23:51:34 crc kubenswrapper[4734]: I1205 23:51:34.211934 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 23:51:34 crc kubenswrapper[4734]: I1205 23:51:34.216860 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hwrl9"] Dec 05 23:51:34 crc kubenswrapper[4734]: I1205 23:51:34.320725 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6de30094-9f75-467b-a935-3abbdf98e94c-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hwrl9\" (UID: \"6de30094-9f75-467b-a935-3abbdf98e94c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hwrl9" Dec 05 23:51:34 crc kubenswrapper[4734]: I1205 23:51:34.321092 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mckcp\" (UniqueName: \"kubernetes.io/projected/6de30094-9f75-467b-a935-3abbdf98e94c-kube-api-access-mckcp\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hwrl9\" (UID: \"6de30094-9f75-467b-a935-3abbdf98e94c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hwrl9" Dec 05 23:51:34 crc kubenswrapper[4734]: I1205 23:51:34.321614 4734 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6de30094-9f75-467b-a935-3abbdf98e94c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hwrl9\" (UID: \"6de30094-9f75-467b-a935-3abbdf98e94c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hwrl9" Dec 05 23:51:34 crc kubenswrapper[4734]: I1205 23:51:34.423627 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6de30094-9f75-467b-a935-3abbdf98e94c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hwrl9\" (UID: \"6de30094-9f75-467b-a935-3abbdf98e94c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hwrl9" Dec 05 23:51:34 crc kubenswrapper[4734]: I1205 23:51:34.423726 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6de30094-9f75-467b-a935-3abbdf98e94c-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hwrl9\" (UID: \"6de30094-9f75-467b-a935-3abbdf98e94c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hwrl9" Dec 05 23:51:34 crc kubenswrapper[4734]: I1205 23:51:34.423819 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mckcp\" (UniqueName: \"kubernetes.io/projected/6de30094-9f75-467b-a935-3abbdf98e94c-kube-api-access-mckcp\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hwrl9\" (UID: \"6de30094-9f75-467b-a935-3abbdf98e94c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hwrl9" Dec 05 23:51:34 crc kubenswrapper[4734]: I1205 23:51:34.429700 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6de30094-9f75-467b-a935-3abbdf98e94c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hwrl9\" (UID: 
\"6de30094-9f75-467b-a935-3abbdf98e94c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hwrl9" Dec 05 23:51:34 crc kubenswrapper[4734]: I1205 23:51:34.429806 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6de30094-9f75-467b-a935-3abbdf98e94c-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hwrl9\" (UID: \"6de30094-9f75-467b-a935-3abbdf98e94c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hwrl9" Dec 05 23:51:34 crc kubenswrapper[4734]: I1205 23:51:34.445931 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mckcp\" (UniqueName: \"kubernetes.io/projected/6de30094-9f75-467b-a935-3abbdf98e94c-kube-api-access-mckcp\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hwrl9\" (UID: \"6de30094-9f75-467b-a935-3abbdf98e94c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hwrl9" Dec 05 23:51:34 crc kubenswrapper[4734]: I1205 23:51:34.577435 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hwrl9" Dec 05 23:51:35 crc kubenswrapper[4734]: I1205 23:51:35.168562 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hwrl9"] Dec 05 23:51:36 crc kubenswrapper[4734]: I1205 23:51:36.118201 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hwrl9" event={"ID":"6de30094-9f75-467b-a935-3abbdf98e94c","Type":"ContainerStarted","Data":"409a2e80cd11cb3c6042c7fd96270390ea4653a40b44aeafad6095d3130bdb7f"} Dec 05 23:51:36 crc kubenswrapper[4734]: I1205 23:51:36.118713 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hwrl9" event={"ID":"6de30094-9f75-467b-a935-3abbdf98e94c","Type":"ContainerStarted","Data":"fa68b1c4c56c8b16269d520f9b26577ab5629946a8a05510910a1fc1ef34e8aa"} Dec 05 23:51:36 crc kubenswrapper[4734]: I1205 23:51:36.145067 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hwrl9" podStartSLOduration=1.738908251 podStartE2EDuration="2.145045101s" podCreationTimestamp="2025-12-05 23:51:34 +0000 UTC" firstStartedPulling="2025-12-05 23:51:35.170616964 +0000 UTC m=+1915.854021240" lastFinishedPulling="2025-12-05 23:51:35.576753814 +0000 UTC m=+1916.260158090" observedRunningTime="2025-12-05 23:51:36.138518603 +0000 UTC m=+1916.821922889" watchObservedRunningTime="2025-12-05 23:51:36.145045101 +0000 UTC m=+1916.828449367" Dec 05 23:51:51 crc kubenswrapper[4734]: I1205 23:51:51.852420 4734 scope.go:117] "RemoveContainer" containerID="c44f55b4f5d13633a352b4bc1c0a5397ffd26bf622ddbfc60fdb7b3ad6830200" Dec 05 23:51:51 crc kubenswrapper[4734]: I1205 23:51:51.918217 4734 scope.go:117] "RemoveContainer" containerID="174ec252ff03fe6b25dab570d0b619bb332013779daf1f7f9acf87fea2e9e9d5" Dec 05 23:51:51 crc 
kubenswrapper[4734]: I1205 23:51:51.968761 4734 scope.go:117] "RemoveContainer" containerID="0891207bc2830d2d42212b8eedba6e39861627ec1abe38f4497aeb10177aa323" Dec 05 23:51:51 crc kubenswrapper[4734]: I1205 23:51:51.995407 4734 scope.go:117] "RemoveContainer" containerID="6de4cf91cda98d53addaf37739c09c478234eec14caccea15d2613e9bd71fd51" Dec 05 23:51:52 crc kubenswrapper[4734]: I1205 23:51:52.045849 4734 scope.go:117] "RemoveContainer" containerID="bec55f5f18a35d43fe937b0dc9b8353b04e1352885a8fbbceec3c335eda60fd6" Dec 05 23:51:59 crc kubenswrapper[4734]: I1205 23:51:59.047951 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-pc6gs"] Dec 05 23:51:59 crc kubenswrapper[4734]: I1205 23:51:59.058272 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-pc6gs"] Dec 05 23:51:59 crc kubenswrapper[4734]: I1205 23:51:59.628210 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dcbaddd-94e5-4096-832c-a3ea35b141b8" path="/var/lib/kubelet/pods/8dcbaddd-94e5-4096-832c-a3ea35b141b8/volumes" Dec 05 23:52:30 crc kubenswrapper[4734]: I1205 23:52:30.687815 4734 generic.go:334] "Generic (PLEG): container finished" podID="6de30094-9f75-467b-a935-3abbdf98e94c" containerID="409a2e80cd11cb3c6042c7fd96270390ea4653a40b44aeafad6095d3130bdb7f" exitCode=0 Dec 05 23:52:30 crc kubenswrapper[4734]: I1205 23:52:30.687954 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hwrl9" event={"ID":"6de30094-9f75-467b-a935-3abbdf98e94c","Type":"ContainerDied","Data":"409a2e80cd11cb3c6042c7fd96270390ea4653a40b44aeafad6095d3130bdb7f"} Dec 05 23:52:32 crc kubenswrapper[4734]: I1205 23:52:32.216407 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hwrl9" Dec 05 23:52:32 crc kubenswrapper[4734]: I1205 23:52:32.333444 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6de30094-9f75-467b-a935-3abbdf98e94c-ssh-key\") pod \"6de30094-9f75-467b-a935-3abbdf98e94c\" (UID: \"6de30094-9f75-467b-a935-3abbdf98e94c\") " Dec 05 23:52:32 crc kubenswrapper[4734]: I1205 23:52:32.333711 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mckcp\" (UniqueName: \"kubernetes.io/projected/6de30094-9f75-467b-a935-3abbdf98e94c-kube-api-access-mckcp\") pod \"6de30094-9f75-467b-a935-3abbdf98e94c\" (UID: \"6de30094-9f75-467b-a935-3abbdf98e94c\") " Dec 05 23:52:32 crc kubenswrapper[4734]: I1205 23:52:32.333821 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6de30094-9f75-467b-a935-3abbdf98e94c-inventory\") pod \"6de30094-9f75-467b-a935-3abbdf98e94c\" (UID: \"6de30094-9f75-467b-a935-3abbdf98e94c\") " Dec 05 23:52:32 crc kubenswrapper[4734]: I1205 23:52:32.341985 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6de30094-9f75-467b-a935-3abbdf98e94c-kube-api-access-mckcp" (OuterVolumeSpecName: "kube-api-access-mckcp") pod "6de30094-9f75-467b-a935-3abbdf98e94c" (UID: "6de30094-9f75-467b-a935-3abbdf98e94c"). InnerVolumeSpecName "kube-api-access-mckcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:52:32 crc kubenswrapper[4734]: I1205 23:52:32.369918 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6de30094-9f75-467b-a935-3abbdf98e94c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6de30094-9f75-467b-a935-3abbdf98e94c" (UID: "6de30094-9f75-467b-a935-3abbdf98e94c"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:52:32 crc kubenswrapper[4734]: I1205 23:52:32.371424 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6de30094-9f75-467b-a935-3abbdf98e94c-inventory" (OuterVolumeSpecName: "inventory") pod "6de30094-9f75-467b-a935-3abbdf98e94c" (UID: "6de30094-9f75-467b-a935-3abbdf98e94c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:52:32 crc kubenswrapper[4734]: I1205 23:52:32.438257 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mckcp\" (UniqueName: \"kubernetes.io/projected/6de30094-9f75-467b-a935-3abbdf98e94c-kube-api-access-mckcp\") on node \"crc\" DevicePath \"\"" Dec 05 23:52:32 crc kubenswrapper[4734]: I1205 23:52:32.438888 4734 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6de30094-9f75-467b-a935-3abbdf98e94c-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 23:52:32 crc kubenswrapper[4734]: I1205 23:52:32.439110 4734 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6de30094-9f75-467b-a935-3abbdf98e94c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 23:52:32 crc kubenswrapper[4734]: I1205 23:52:32.713276 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hwrl9" event={"ID":"6de30094-9f75-467b-a935-3abbdf98e94c","Type":"ContainerDied","Data":"fa68b1c4c56c8b16269d520f9b26577ab5629946a8a05510910a1fc1ef34e8aa"} Dec 05 23:52:32 crc kubenswrapper[4734]: I1205 23:52:32.713330 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa68b1c4c56c8b16269d520f9b26577ab5629946a8a05510910a1fc1ef34e8aa" Dec 05 23:52:32 crc kubenswrapper[4734]: I1205 23:52:32.713371 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hwrl9" Dec 05 23:52:32 crc kubenswrapper[4734]: I1205 23:52:32.813285 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-mtqlw"] Dec 05 23:52:32 crc kubenswrapper[4734]: E1205 23:52:32.813844 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6de30094-9f75-467b-a935-3abbdf98e94c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 05 23:52:32 crc kubenswrapper[4734]: I1205 23:52:32.813860 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="6de30094-9f75-467b-a935-3abbdf98e94c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 05 23:52:32 crc kubenswrapper[4734]: I1205 23:52:32.814108 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="6de30094-9f75-467b-a935-3abbdf98e94c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 05 23:52:32 crc kubenswrapper[4734]: I1205 23:52:32.814958 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-mtqlw" Dec 05 23:52:32 crc kubenswrapper[4734]: I1205 23:52:32.820090 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 23:52:32 crc kubenswrapper[4734]: I1205 23:52:32.820294 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 23:52:32 crc kubenswrapper[4734]: I1205 23:52:32.820382 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 23:52:32 crc kubenswrapper[4734]: I1205 23:52:32.820432 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gsdqx" Dec 05 23:52:32 crc kubenswrapper[4734]: I1205 23:52:32.826659 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-mtqlw"] Dec 05 23:52:32 crc kubenswrapper[4734]: I1205 23:52:32.954386 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d16abc61-9f6e-4980-9821-af436f2501fe-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-mtqlw\" (UID: \"d16abc61-9f6e-4980-9821-af436f2501fe\") " pod="openstack/ssh-known-hosts-edpm-deployment-mtqlw" Dec 05 23:52:32 crc kubenswrapper[4734]: I1205 23:52:32.955254 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d16abc61-9f6e-4980-9821-af436f2501fe-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-mtqlw\" (UID: \"d16abc61-9f6e-4980-9821-af436f2501fe\") " pod="openstack/ssh-known-hosts-edpm-deployment-mtqlw" Dec 05 23:52:32 crc kubenswrapper[4734]: I1205 23:52:32.955349 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-gbcjd\" (UniqueName: \"kubernetes.io/projected/d16abc61-9f6e-4980-9821-af436f2501fe-kube-api-access-gbcjd\") pod \"ssh-known-hosts-edpm-deployment-mtqlw\" (UID: \"d16abc61-9f6e-4980-9821-af436f2501fe\") " pod="openstack/ssh-known-hosts-edpm-deployment-mtqlw" Dec 05 23:52:33 crc kubenswrapper[4734]: I1205 23:52:33.058114 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d16abc61-9f6e-4980-9821-af436f2501fe-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-mtqlw\" (UID: \"d16abc61-9f6e-4980-9821-af436f2501fe\") " pod="openstack/ssh-known-hosts-edpm-deployment-mtqlw" Dec 05 23:52:33 crc kubenswrapper[4734]: I1205 23:52:33.058256 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d16abc61-9f6e-4980-9821-af436f2501fe-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-mtqlw\" (UID: \"d16abc61-9f6e-4980-9821-af436f2501fe\") " pod="openstack/ssh-known-hosts-edpm-deployment-mtqlw" Dec 05 23:52:33 crc kubenswrapper[4734]: I1205 23:52:33.058290 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbcjd\" (UniqueName: \"kubernetes.io/projected/d16abc61-9f6e-4980-9821-af436f2501fe-kube-api-access-gbcjd\") pod \"ssh-known-hosts-edpm-deployment-mtqlw\" (UID: \"d16abc61-9f6e-4980-9821-af436f2501fe\") " pod="openstack/ssh-known-hosts-edpm-deployment-mtqlw" Dec 05 23:52:33 crc kubenswrapper[4734]: I1205 23:52:33.067005 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d16abc61-9f6e-4980-9821-af436f2501fe-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-mtqlw\" (UID: \"d16abc61-9f6e-4980-9821-af436f2501fe\") " pod="openstack/ssh-known-hosts-edpm-deployment-mtqlw" Dec 05 23:52:33 crc kubenswrapper[4734]: I1205 23:52:33.074196 4734 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d16abc61-9f6e-4980-9821-af436f2501fe-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-mtqlw\" (UID: \"d16abc61-9f6e-4980-9821-af436f2501fe\") " pod="openstack/ssh-known-hosts-edpm-deployment-mtqlw" Dec 05 23:52:33 crc kubenswrapper[4734]: I1205 23:52:33.079772 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbcjd\" (UniqueName: \"kubernetes.io/projected/d16abc61-9f6e-4980-9821-af436f2501fe-kube-api-access-gbcjd\") pod \"ssh-known-hosts-edpm-deployment-mtqlw\" (UID: \"d16abc61-9f6e-4980-9821-af436f2501fe\") " pod="openstack/ssh-known-hosts-edpm-deployment-mtqlw" Dec 05 23:52:33 crc kubenswrapper[4734]: I1205 23:52:33.133932 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-mtqlw" Dec 05 23:52:33 crc kubenswrapper[4734]: I1205 23:52:33.700685 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-mtqlw"] Dec 05 23:52:33 crc kubenswrapper[4734]: I1205 23:52:33.709274 4734 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 23:52:33 crc kubenswrapper[4734]: I1205 23:52:33.726479 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-mtqlw" event={"ID":"d16abc61-9f6e-4980-9821-af436f2501fe","Type":"ContainerStarted","Data":"840e9049357606261d407f45cd6b52b6500519e87fea3b4c610b4bc1afa3e685"} Dec 05 23:52:34 crc kubenswrapper[4734]: I1205 23:52:34.742219 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-mtqlw" event={"ID":"d16abc61-9f6e-4980-9821-af436f2501fe","Type":"ContainerStarted","Data":"fbe12d4553e65e62b0bdfb157fb4fa15ca143e03cff8517dbfc3770b96a064f9"} Dec 05 23:52:34 crc kubenswrapper[4734]: I1205 
23:52:34.771420 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-mtqlw" podStartSLOduration=2.330931553 podStartE2EDuration="2.77139226s" podCreationTimestamp="2025-12-05 23:52:32 +0000 UTC" firstStartedPulling="2025-12-05 23:52:33.709025669 +0000 UTC m=+1974.392429945" lastFinishedPulling="2025-12-05 23:52:34.149486376 +0000 UTC m=+1974.832890652" observedRunningTime="2025-12-05 23:52:34.76400492 +0000 UTC m=+1975.447409196" watchObservedRunningTime="2025-12-05 23:52:34.77139226 +0000 UTC m=+1975.454796536" Dec 05 23:52:41 crc kubenswrapper[4734]: I1205 23:52:41.823151 4734 generic.go:334] "Generic (PLEG): container finished" podID="d16abc61-9f6e-4980-9821-af436f2501fe" containerID="fbe12d4553e65e62b0bdfb157fb4fa15ca143e03cff8517dbfc3770b96a064f9" exitCode=0 Dec 05 23:52:41 crc kubenswrapper[4734]: I1205 23:52:41.823321 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-mtqlw" event={"ID":"d16abc61-9f6e-4980-9821-af436f2501fe","Type":"ContainerDied","Data":"fbe12d4553e65e62b0bdfb157fb4fa15ca143e03cff8517dbfc3770b96a064f9"} Dec 05 23:52:43 crc kubenswrapper[4734]: I1205 23:52:43.288071 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-mtqlw" Dec 05 23:52:43 crc kubenswrapper[4734]: I1205 23:52:43.414200 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d16abc61-9f6e-4980-9821-af436f2501fe-inventory-0\") pod \"d16abc61-9f6e-4980-9821-af436f2501fe\" (UID: \"d16abc61-9f6e-4980-9821-af436f2501fe\") " Dec 05 23:52:43 crc kubenswrapper[4734]: I1205 23:52:43.414298 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbcjd\" (UniqueName: \"kubernetes.io/projected/d16abc61-9f6e-4980-9821-af436f2501fe-kube-api-access-gbcjd\") pod \"d16abc61-9f6e-4980-9821-af436f2501fe\" (UID: \"d16abc61-9f6e-4980-9821-af436f2501fe\") " Dec 05 23:52:43 crc kubenswrapper[4734]: I1205 23:52:43.414329 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d16abc61-9f6e-4980-9821-af436f2501fe-ssh-key-openstack-edpm-ipam\") pod \"d16abc61-9f6e-4980-9821-af436f2501fe\" (UID: \"d16abc61-9f6e-4980-9821-af436f2501fe\") " Dec 05 23:52:43 crc kubenswrapper[4734]: I1205 23:52:43.421269 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d16abc61-9f6e-4980-9821-af436f2501fe-kube-api-access-gbcjd" (OuterVolumeSpecName: "kube-api-access-gbcjd") pod "d16abc61-9f6e-4980-9821-af436f2501fe" (UID: "d16abc61-9f6e-4980-9821-af436f2501fe"). InnerVolumeSpecName "kube-api-access-gbcjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:52:43 crc kubenswrapper[4734]: I1205 23:52:43.445807 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d16abc61-9f6e-4980-9821-af436f2501fe-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "d16abc61-9f6e-4980-9821-af436f2501fe" (UID: "d16abc61-9f6e-4980-9821-af436f2501fe"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:52:43 crc kubenswrapper[4734]: I1205 23:52:43.446626 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d16abc61-9f6e-4980-9821-af436f2501fe-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d16abc61-9f6e-4980-9821-af436f2501fe" (UID: "d16abc61-9f6e-4980-9821-af436f2501fe"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:52:43 crc kubenswrapper[4734]: I1205 23:52:43.517789 4734 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d16abc61-9f6e-4980-9821-af436f2501fe-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 05 23:52:43 crc kubenswrapper[4734]: I1205 23:52:43.517835 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbcjd\" (UniqueName: \"kubernetes.io/projected/d16abc61-9f6e-4980-9821-af436f2501fe-kube-api-access-gbcjd\") on node \"crc\" DevicePath \"\"" Dec 05 23:52:43 crc kubenswrapper[4734]: I1205 23:52:43.517847 4734 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d16abc61-9f6e-4980-9821-af436f2501fe-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 05 23:52:43 crc kubenswrapper[4734]: I1205 23:52:43.845228 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-mtqlw" event={"ID":"d16abc61-9f6e-4980-9821-af436f2501fe","Type":"ContainerDied","Data":"840e9049357606261d407f45cd6b52b6500519e87fea3b4c610b4bc1afa3e685"} Dec 05 23:52:43 crc kubenswrapper[4734]: I1205 23:52:43.845657 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="840e9049357606261d407f45cd6b52b6500519e87fea3b4c610b4bc1afa3e685" Dec 05 23:52:43 crc kubenswrapper[4734]: I1205 23:52:43.845321 
4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-mtqlw" Dec 05 23:52:43 crc kubenswrapper[4734]: I1205 23:52:43.937047 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmm9q"] Dec 05 23:52:43 crc kubenswrapper[4734]: E1205 23:52:43.937741 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d16abc61-9f6e-4980-9821-af436f2501fe" containerName="ssh-known-hosts-edpm-deployment" Dec 05 23:52:43 crc kubenswrapper[4734]: I1205 23:52:43.937772 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="d16abc61-9f6e-4980-9821-af436f2501fe" containerName="ssh-known-hosts-edpm-deployment" Dec 05 23:52:43 crc kubenswrapper[4734]: I1205 23:52:43.938073 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="d16abc61-9f6e-4980-9821-af436f2501fe" containerName="ssh-known-hosts-edpm-deployment" Dec 05 23:52:43 crc kubenswrapper[4734]: I1205 23:52:43.939183 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmm9q" Dec 05 23:52:43 crc kubenswrapper[4734]: I1205 23:52:43.944125 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 23:52:43 crc kubenswrapper[4734]: I1205 23:52:43.944401 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 23:52:43 crc kubenswrapper[4734]: I1205 23:52:43.944560 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gsdqx" Dec 05 23:52:43 crc kubenswrapper[4734]: I1205 23:52:43.948510 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmm9q"] Dec 05 23:52:43 crc kubenswrapper[4734]: I1205 23:52:43.949154 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 23:52:44 crc kubenswrapper[4734]: I1205 23:52:44.028834 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/378f4ff2-7e86-40ca-b771-155a02f5cb45-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pmm9q\" (UID: \"378f4ff2-7e86-40ca-b771-155a02f5cb45\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmm9q" Dec 05 23:52:44 crc kubenswrapper[4734]: I1205 23:52:44.028903 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/378f4ff2-7e86-40ca-b771-155a02f5cb45-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pmm9q\" (UID: \"378f4ff2-7e86-40ca-b771-155a02f5cb45\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmm9q" Dec 05 23:52:44 crc kubenswrapper[4734]: I1205 23:52:44.029063 4734 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sh9r\" (UniqueName: \"kubernetes.io/projected/378f4ff2-7e86-40ca-b771-155a02f5cb45-kube-api-access-8sh9r\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pmm9q\" (UID: \"378f4ff2-7e86-40ca-b771-155a02f5cb45\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmm9q" Dec 05 23:52:44 crc kubenswrapper[4734]: I1205 23:52:44.131384 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/378f4ff2-7e86-40ca-b771-155a02f5cb45-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pmm9q\" (UID: \"378f4ff2-7e86-40ca-b771-155a02f5cb45\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmm9q" Dec 05 23:52:44 crc kubenswrapper[4734]: I1205 23:52:44.131477 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/378f4ff2-7e86-40ca-b771-155a02f5cb45-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pmm9q\" (UID: \"378f4ff2-7e86-40ca-b771-155a02f5cb45\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmm9q" Dec 05 23:52:44 crc kubenswrapper[4734]: I1205 23:52:44.131672 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sh9r\" (UniqueName: \"kubernetes.io/projected/378f4ff2-7e86-40ca-b771-155a02f5cb45-kube-api-access-8sh9r\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pmm9q\" (UID: \"378f4ff2-7e86-40ca-b771-155a02f5cb45\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmm9q" Dec 05 23:52:44 crc kubenswrapper[4734]: I1205 23:52:44.136734 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/378f4ff2-7e86-40ca-b771-155a02f5cb45-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pmm9q\" (UID: \"378f4ff2-7e86-40ca-b771-155a02f5cb45\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmm9q" Dec 05 23:52:44 crc kubenswrapper[4734]: I1205 23:52:44.137265 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/378f4ff2-7e86-40ca-b771-155a02f5cb45-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pmm9q\" (UID: \"378f4ff2-7e86-40ca-b771-155a02f5cb45\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmm9q" Dec 05 23:52:44 crc kubenswrapper[4734]: I1205 23:52:44.149557 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sh9r\" (UniqueName: \"kubernetes.io/projected/378f4ff2-7e86-40ca-b771-155a02f5cb45-kube-api-access-8sh9r\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pmm9q\" (UID: \"378f4ff2-7e86-40ca-b771-155a02f5cb45\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmm9q" Dec 05 23:52:44 crc kubenswrapper[4734]: I1205 23:52:44.269124 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmm9q" Dec 05 23:52:44 crc kubenswrapper[4734]: I1205 23:52:44.824034 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmm9q"] Dec 05 23:52:44 crc kubenswrapper[4734]: I1205 23:52:44.859569 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmm9q" event={"ID":"378f4ff2-7e86-40ca-b771-155a02f5cb45","Type":"ContainerStarted","Data":"a550fe339482ba02dc29dd103ce9be84babef0c5c38b03e632276cc4f51f0250"} Dec 05 23:52:45 crc kubenswrapper[4734]: I1205 23:52:45.870640 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmm9q" event={"ID":"378f4ff2-7e86-40ca-b771-155a02f5cb45","Type":"ContainerStarted","Data":"856c5fbe98cf0207a9dd3eacbb6e630e5a1dd051cb2eff6ade13f1cb4cd79009"} Dec 05 23:52:45 crc kubenswrapper[4734]: I1205 23:52:45.896727 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmm9q" podStartSLOduration=2.249912108 podStartE2EDuration="2.896701318s" podCreationTimestamp="2025-12-05 23:52:43 +0000 UTC" firstStartedPulling="2025-12-05 23:52:44.831200841 +0000 UTC m=+1985.514605117" lastFinishedPulling="2025-12-05 23:52:45.477990051 +0000 UTC m=+1986.161394327" observedRunningTime="2025-12-05 23:52:45.891831119 +0000 UTC m=+1986.575235395" watchObservedRunningTime="2025-12-05 23:52:45.896701318 +0000 UTC m=+1986.580105594" Dec 05 23:52:52 crc kubenswrapper[4734]: I1205 23:52:52.141513 4734 scope.go:117] "RemoveContainer" containerID="ed2b1d754974e88c714d0e222f041a774559815332b801c5e97e6f07e8b4b396" Dec 05 23:52:54 crc kubenswrapper[4734]: I1205 23:52:54.984097 4734 generic.go:334] "Generic (PLEG): container finished" podID="378f4ff2-7e86-40ca-b771-155a02f5cb45" 
containerID="856c5fbe98cf0207a9dd3eacbb6e630e5a1dd051cb2eff6ade13f1cb4cd79009" exitCode=0 Dec 05 23:52:54 crc kubenswrapper[4734]: I1205 23:52:54.984188 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmm9q" event={"ID":"378f4ff2-7e86-40ca-b771-155a02f5cb45","Type":"ContainerDied","Data":"856c5fbe98cf0207a9dd3eacbb6e630e5a1dd051cb2eff6ade13f1cb4cd79009"} Dec 05 23:52:56 crc kubenswrapper[4734]: I1205 23:52:56.440303 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmm9q" Dec 05 23:52:56 crc kubenswrapper[4734]: I1205 23:52:56.614490 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/378f4ff2-7e86-40ca-b771-155a02f5cb45-inventory\") pod \"378f4ff2-7e86-40ca-b771-155a02f5cb45\" (UID: \"378f4ff2-7e86-40ca-b771-155a02f5cb45\") " Dec 05 23:52:56 crc kubenswrapper[4734]: I1205 23:52:56.615051 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sh9r\" (UniqueName: \"kubernetes.io/projected/378f4ff2-7e86-40ca-b771-155a02f5cb45-kube-api-access-8sh9r\") pod \"378f4ff2-7e86-40ca-b771-155a02f5cb45\" (UID: \"378f4ff2-7e86-40ca-b771-155a02f5cb45\") " Dec 05 23:52:56 crc kubenswrapper[4734]: I1205 23:52:56.615514 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/378f4ff2-7e86-40ca-b771-155a02f5cb45-ssh-key\") pod \"378f4ff2-7e86-40ca-b771-155a02f5cb45\" (UID: \"378f4ff2-7e86-40ca-b771-155a02f5cb45\") " Dec 05 23:52:56 crc kubenswrapper[4734]: I1205 23:52:56.624947 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/378f4ff2-7e86-40ca-b771-155a02f5cb45-kube-api-access-8sh9r" (OuterVolumeSpecName: "kube-api-access-8sh9r") pod "378f4ff2-7e86-40ca-b771-155a02f5cb45" (UID: 
"378f4ff2-7e86-40ca-b771-155a02f5cb45"). InnerVolumeSpecName "kube-api-access-8sh9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:52:56 crc kubenswrapper[4734]: I1205 23:52:56.649109 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/378f4ff2-7e86-40ca-b771-155a02f5cb45-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "378f4ff2-7e86-40ca-b771-155a02f5cb45" (UID: "378f4ff2-7e86-40ca-b771-155a02f5cb45"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:52:56 crc kubenswrapper[4734]: I1205 23:52:56.672000 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/378f4ff2-7e86-40ca-b771-155a02f5cb45-inventory" (OuterVolumeSpecName: "inventory") pod "378f4ff2-7e86-40ca-b771-155a02f5cb45" (UID: "378f4ff2-7e86-40ca-b771-155a02f5cb45"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:52:56 crc kubenswrapper[4734]: I1205 23:52:56.718908 4734 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/378f4ff2-7e86-40ca-b771-155a02f5cb45-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 23:52:56 crc kubenswrapper[4734]: I1205 23:52:56.718965 4734 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/378f4ff2-7e86-40ca-b771-155a02f5cb45-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 23:52:56 crc kubenswrapper[4734]: I1205 23:52:56.718987 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sh9r\" (UniqueName: \"kubernetes.io/projected/378f4ff2-7e86-40ca-b771-155a02f5cb45-kube-api-access-8sh9r\") on node \"crc\" DevicePath \"\"" Dec 05 23:52:57 crc kubenswrapper[4734]: I1205 23:52:57.007692 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmm9q" 
event={"ID":"378f4ff2-7e86-40ca-b771-155a02f5cb45","Type":"ContainerDied","Data":"a550fe339482ba02dc29dd103ce9be84babef0c5c38b03e632276cc4f51f0250"} Dec 05 23:52:57 crc kubenswrapper[4734]: I1205 23:52:57.007745 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a550fe339482ba02dc29dd103ce9be84babef0c5c38b03e632276cc4f51f0250" Dec 05 23:52:57 crc kubenswrapper[4734]: I1205 23:52:57.007747 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmm9q" Dec 05 23:52:57 crc kubenswrapper[4734]: I1205 23:52:57.108394 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lz74w"] Dec 05 23:52:57 crc kubenswrapper[4734]: E1205 23:52:57.109084 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="378f4ff2-7e86-40ca-b771-155a02f5cb45" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 05 23:52:57 crc kubenswrapper[4734]: I1205 23:52:57.109115 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="378f4ff2-7e86-40ca-b771-155a02f5cb45" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 05 23:52:57 crc kubenswrapper[4734]: I1205 23:52:57.109425 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="378f4ff2-7e86-40ca-b771-155a02f5cb45" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 05 23:52:57 crc kubenswrapper[4734]: I1205 23:52:57.110569 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lz74w" Dec 05 23:52:57 crc kubenswrapper[4734]: I1205 23:52:57.116779 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 23:52:57 crc kubenswrapper[4734]: I1205 23:52:57.116908 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gsdqx" Dec 05 23:52:57 crc kubenswrapper[4734]: I1205 23:52:57.117100 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 23:52:57 crc kubenswrapper[4734]: I1205 23:52:57.121945 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 23:52:57 crc kubenswrapper[4734]: I1205 23:52:57.124290 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lz74w"] Dec 05 23:52:57 crc kubenswrapper[4734]: I1205 23:52:57.127826 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6srq6\" (UniqueName: \"kubernetes.io/projected/b9d39a80-01a8-421a-afac-94171314c0e1-kube-api-access-6srq6\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lz74w\" (UID: \"b9d39a80-01a8-421a-afac-94171314c0e1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lz74w" Dec 05 23:52:57 crc kubenswrapper[4734]: I1205 23:52:57.127924 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b9d39a80-01a8-421a-afac-94171314c0e1-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lz74w\" (UID: \"b9d39a80-01a8-421a-afac-94171314c0e1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lz74w" Dec 05 23:52:57 crc kubenswrapper[4734]: I1205 23:52:57.128193 4734 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b9d39a80-01a8-421a-afac-94171314c0e1-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lz74w\" (UID: \"b9d39a80-01a8-421a-afac-94171314c0e1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lz74w" Dec 05 23:52:57 crc kubenswrapper[4734]: I1205 23:52:57.230329 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6srq6\" (UniqueName: \"kubernetes.io/projected/b9d39a80-01a8-421a-afac-94171314c0e1-kube-api-access-6srq6\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lz74w\" (UID: \"b9d39a80-01a8-421a-afac-94171314c0e1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lz74w" Dec 05 23:52:57 crc kubenswrapper[4734]: I1205 23:52:57.230816 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b9d39a80-01a8-421a-afac-94171314c0e1-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lz74w\" (UID: \"b9d39a80-01a8-421a-afac-94171314c0e1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lz74w" Dec 05 23:52:57 crc kubenswrapper[4734]: I1205 23:52:57.231142 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b9d39a80-01a8-421a-afac-94171314c0e1-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lz74w\" (UID: \"b9d39a80-01a8-421a-afac-94171314c0e1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lz74w" Dec 05 23:52:57 crc kubenswrapper[4734]: I1205 23:52:57.235943 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b9d39a80-01a8-421a-afac-94171314c0e1-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lz74w\" (UID: \"b9d39a80-01a8-421a-afac-94171314c0e1\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lz74w" Dec 05 23:52:57 crc kubenswrapper[4734]: I1205 23:52:57.241570 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b9d39a80-01a8-421a-afac-94171314c0e1-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lz74w\" (UID: \"b9d39a80-01a8-421a-afac-94171314c0e1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lz74w" Dec 05 23:52:57 crc kubenswrapper[4734]: I1205 23:52:57.248792 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6srq6\" (UniqueName: \"kubernetes.io/projected/b9d39a80-01a8-421a-afac-94171314c0e1-kube-api-access-6srq6\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lz74w\" (UID: \"b9d39a80-01a8-421a-afac-94171314c0e1\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lz74w" Dec 05 23:52:57 crc kubenswrapper[4734]: I1205 23:52:57.432260 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lz74w" Dec 05 23:52:58 crc kubenswrapper[4734]: I1205 23:52:58.023396 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lz74w"] Dec 05 23:52:59 crc kubenswrapper[4734]: I1205 23:52:59.032237 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lz74w" event={"ID":"b9d39a80-01a8-421a-afac-94171314c0e1","Type":"ContainerStarted","Data":"51d761519777c10f47512a73a3e465ce86faef2a5d332abe4a636a8768ee13a5"} Dec 05 23:52:59 crc kubenswrapper[4734]: I1205 23:52:59.032771 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lz74w" event={"ID":"b9d39a80-01a8-421a-afac-94171314c0e1","Type":"ContainerStarted","Data":"625f8ff2813ecef231d45e98c36d2ade52ed271f3e57f081a7542e36a02fb1eb"} Dec 05 23:52:59 crc kubenswrapper[4734]: I1205 23:52:59.067494 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lz74w" podStartSLOduration=1.661670502 podStartE2EDuration="2.067460904s" podCreationTimestamp="2025-12-05 23:52:57 +0000 UTC" firstStartedPulling="2025-12-05 23:52:58.03163304 +0000 UTC m=+1998.715037316" lastFinishedPulling="2025-12-05 23:52:58.437423442 +0000 UTC m=+1999.120827718" observedRunningTime="2025-12-05 23:52:59.056958508 +0000 UTC m=+1999.740362784" watchObservedRunningTime="2025-12-05 23:52:59.067460904 +0000 UTC m=+1999.750865180" Dec 05 23:53:09 crc kubenswrapper[4734]: I1205 23:53:09.141386 4734 generic.go:334] "Generic (PLEG): container finished" podID="b9d39a80-01a8-421a-afac-94171314c0e1" containerID="51d761519777c10f47512a73a3e465ce86faef2a5d332abe4a636a8768ee13a5" exitCode=0 Dec 05 23:53:09 crc kubenswrapper[4734]: I1205 23:53:09.141491 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lz74w" event={"ID":"b9d39a80-01a8-421a-afac-94171314c0e1","Type":"ContainerDied","Data":"51d761519777c10f47512a73a3e465ce86faef2a5d332abe4a636a8768ee13a5"} Dec 05 23:53:10 crc kubenswrapper[4734]: I1205 23:53:10.586287 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lz74w" Dec 05 23:53:10 crc kubenswrapper[4734]: I1205 23:53:10.656305 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b9d39a80-01a8-421a-afac-94171314c0e1-ssh-key\") pod \"b9d39a80-01a8-421a-afac-94171314c0e1\" (UID: \"b9d39a80-01a8-421a-afac-94171314c0e1\") " Dec 05 23:53:10 crc kubenswrapper[4734]: I1205 23:53:10.656432 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b9d39a80-01a8-421a-afac-94171314c0e1-inventory\") pod \"b9d39a80-01a8-421a-afac-94171314c0e1\" (UID: \"b9d39a80-01a8-421a-afac-94171314c0e1\") " Dec 05 23:53:10 crc kubenswrapper[4734]: I1205 23:53:10.656507 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6srq6\" (UniqueName: \"kubernetes.io/projected/b9d39a80-01a8-421a-afac-94171314c0e1-kube-api-access-6srq6\") pod \"b9d39a80-01a8-421a-afac-94171314c0e1\" (UID: \"b9d39a80-01a8-421a-afac-94171314c0e1\") " Dec 05 23:53:10 crc kubenswrapper[4734]: I1205 23:53:10.663071 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9d39a80-01a8-421a-afac-94171314c0e1-kube-api-access-6srq6" (OuterVolumeSpecName: "kube-api-access-6srq6") pod "b9d39a80-01a8-421a-afac-94171314c0e1" (UID: "b9d39a80-01a8-421a-afac-94171314c0e1"). InnerVolumeSpecName "kube-api-access-6srq6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:53:10 crc kubenswrapper[4734]: I1205 23:53:10.687302 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9d39a80-01a8-421a-afac-94171314c0e1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b9d39a80-01a8-421a-afac-94171314c0e1" (UID: "b9d39a80-01a8-421a-afac-94171314c0e1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:53:10 crc kubenswrapper[4734]: I1205 23:53:10.689081 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9d39a80-01a8-421a-afac-94171314c0e1-inventory" (OuterVolumeSpecName: "inventory") pod "b9d39a80-01a8-421a-afac-94171314c0e1" (UID: "b9d39a80-01a8-421a-afac-94171314c0e1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:53:10 crc kubenswrapper[4734]: I1205 23:53:10.761791 4734 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b9d39a80-01a8-421a-afac-94171314c0e1-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 23:53:10 crc kubenswrapper[4734]: I1205 23:53:10.761868 4734 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b9d39a80-01a8-421a-afac-94171314c0e1-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 23:53:10 crc kubenswrapper[4734]: I1205 23:53:10.761886 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6srq6\" (UniqueName: \"kubernetes.io/projected/b9d39a80-01a8-421a-afac-94171314c0e1-kube-api-access-6srq6\") on node \"crc\" DevicePath \"\"" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.164930 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lz74w" 
event={"ID":"b9d39a80-01a8-421a-afac-94171314c0e1","Type":"ContainerDied","Data":"625f8ff2813ecef231d45e98c36d2ade52ed271f3e57f081a7542e36a02fb1eb"} Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.165783 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="625f8ff2813ecef231d45e98c36d2ade52ed271f3e57f081a7542e36a02fb1eb" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.164980 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lz74w" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.409369 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv"] Dec 05 23:53:11 crc kubenswrapper[4734]: E1205 23:53:11.410047 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9d39a80-01a8-421a-afac-94171314c0e1" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.410077 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9d39a80-01a8-421a-afac-94171314c0e1" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.410410 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9d39a80-01a8-421a-afac-94171314c0e1" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.411415 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.417635 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv"] Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.422779 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gsdqx" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.423076 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.423764 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.423919 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.424039 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.424184 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.424344 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.424548 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.487980 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45shx\" (UniqueName: 
\"kubernetes.io/projected/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-kube-api-access-45shx\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.488094 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.488131 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.488189 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.488245 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-ssh-key\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.488418 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.488573 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.488626 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.488706 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.488830 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.488970 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.489121 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.489210 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.489282 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.591779 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45shx\" (UniqueName: \"kubernetes.io/projected/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-kube-api-access-45shx\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.591856 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.591887 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-telemetry-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.591914 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.591954 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.591985 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.592030 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.592059 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.592102 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.592144 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.592182 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc 
kubenswrapper[4734]: I1205 23:53:11.592234 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.592276 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.592312 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.598132 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.598935 4734 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.599407 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.599494 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.599578 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.599676 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.600007 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.601307 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.602856 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.604204 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.607400 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.608888 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.610170 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.613030 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45shx\" (UniqueName: \"kubernetes.io/projected/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-kube-api-access-45shx\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv\" (UID: 
\"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:11 crc kubenswrapper[4734]: I1205 23:53:11.737716 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:12 crc kubenswrapper[4734]: I1205 23:53:12.313717 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv"] Dec 05 23:53:12 crc kubenswrapper[4734]: W1205 23:53:12.317689 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod131ed9d5_6ee3_41f4_9e7f_400cd4c0fe98.slice/crio-06ccb30123e2802b8bcc29ef4e467eea1a5c64c1a2cee0fbaf48e44b1a40603e WatchSource:0}: Error finding container 06ccb30123e2802b8bcc29ef4e467eea1a5c64c1a2cee0fbaf48e44b1a40603e: Status 404 returned error can't find the container with id 06ccb30123e2802b8bcc29ef4e467eea1a5c64c1a2cee0fbaf48e44b1a40603e Dec 05 23:53:13 crc kubenswrapper[4734]: I1205 23:53:13.185704 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" event={"ID":"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98","Type":"ContainerStarted","Data":"c45e10f85fa2d9d612ab523bce0b6eb32a88bc6c6493c61174f9c796386f7df6"} Dec 05 23:53:13 crc kubenswrapper[4734]: I1205 23:53:13.186453 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" event={"ID":"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98","Type":"ContainerStarted","Data":"06ccb30123e2802b8bcc29ef4e467eea1a5c64c1a2cee0fbaf48e44b1a40603e"} Dec 05 23:53:27 crc kubenswrapper[4734]: I1205 23:53:27.560545 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" podStartSLOduration=16.152976705 
podStartE2EDuration="16.560490323s" podCreationTimestamp="2025-12-05 23:53:11 +0000 UTC" firstStartedPulling="2025-12-05 23:53:12.320785838 +0000 UTC m=+2013.004190114" lastFinishedPulling="2025-12-05 23:53:12.728299456 +0000 UTC m=+2013.411703732" observedRunningTime="2025-12-05 23:53:13.218324314 +0000 UTC m=+2013.901728590" watchObservedRunningTime="2025-12-05 23:53:27.560490323 +0000 UTC m=+2028.243894599" Dec 05 23:53:27 crc kubenswrapper[4734]: I1205 23:53:27.570504 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zc6n4"] Dec 05 23:53:27 crc kubenswrapper[4734]: I1205 23:53:27.575359 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zc6n4" Dec 05 23:53:27 crc kubenswrapper[4734]: I1205 23:53:27.584857 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zc6n4"] Dec 05 23:53:27 crc kubenswrapper[4734]: I1205 23:53:27.601294 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dlnc\" (UniqueName: \"kubernetes.io/projected/5a2af09b-1940-49e1-ba7d-725592ed000f-kube-api-access-7dlnc\") pod \"redhat-operators-zc6n4\" (UID: \"5a2af09b-1940-49e1-ba7d-725592ed000f\") " pod="openshift-marketplace/redhat-operators-zc6n4" Dec 05 23:53:27 crc kubenswrapper[4734]: I1205 23:53:27.601416 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a2af09b-1940-49e1-ba7d-725592ed000f-catalog-content\") pod \"redhat-operators-zc6n4\" (UID: \"5a2af09b-1940-49e1-ba7d-725592ed000f\") " pod="openshift-marketplace/redhat-operators-zc6n4" Dec 05 23:53:27 crc kubenswrapper[4734]: I1205 23:53:27.601492 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5a2af09b-1940-49e1-ba7d-725592ed000f-utilities\") pod \"redhat-operators-zc6n4\" (UID: \"5a2af09b-1940-49e1-ba7d-725592ed000f\") " pod="openshift-marketplace/redhat-operators-zc6n4" Dec 05 23:53:27 crc kubenswrapper[4734]: I1205 23:53:27.703874 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a2af09b-1940-49e1-ba7d-725592ed000f-catalog-content\") pod \"redhat-operators-zc6n4\" (UID: \"5a2af09b-1940-49e1-ba7d-725592ed000f\") " pod="openshift-marketplace/redhat-operators-zc6n4" Dec 05 23:53:27 crc kubenswrapper[4734]: I1205 23:53:27.704069 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a2af09b-1940-49e1-ba7d-725592ed000f-utilities\") pod \"redhat-operators-zc6n4\" (UID: \"5a2af09b-1940-49e1-ba7d-725592ed000f\") " pod="openshift-marketplace/redhat-operators-zc6n4" Dec 05 23:53:27 crc kubenswrapper[4734]: I1205 23:53:27.704116 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dlnc\" (UniqueName: \"kubernetes.io/projected/5a2af09b-1940-49e1-ba7d-725592ed000f-kube-api-access-7dlnc\") pod \"redhat-operators-zc6n4\" (UID: \"5a2af09b-1940-49e1-ba7d-725592ed000f\") " pod="openshift-marketplace/redhat-operators-zc6n4" Dec 05 23:53:27 crc kubenswrapper[4734]: I1205 23:53:27.704663 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a2af09b-1940-49e1-ba7d-725592ed000f-catalog-content\") pod \"redhat-operators-zc6n4\" (UID: \"5a2af09b-1940-49e1-ba7d-725592ed000f\") " pod="openshift-marketplace/redhat-operators-zc6n4" Dec 05 23:53:27 crc kubenswrapper[4734]: I1205 23:53:27.705648 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5a2af09b-1940-49e1-ba7d-725592ed000f-utilities\") pod \"redhat-operators-zc6n4\" (UID: \"5a2af09b-1940-49e1-ba7d-725592ed000f\") " pod="openshift-marketplace/redhat-operators-zc6n4" Dec 05 23:53:27 crc kubenswrapper[4734]: I1205 23:53:27.737834 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dlnc\" (UniqueName: \"kubernetes.io/projected/5a2af09b-1940-49e1-ba7d-725592ed000f-kube-api-access-7dlnc\") pod \"redhat-operators-zc6n4\" (UID: \"5a2af09b-1940-49e1-ba7d-725592ed000f\") " pod="openshift-marketplace/redhat-operators-zc6n4" Dec 05 23:53:27 crc kubenswrapper[4734]: I1205 23:53:27.896574 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zc6n4" Dec 05 23:53:28 crc kubenswrapper[4734]: I1205 23:53:28.420952 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zc6n4"] Dec 05 23:53:29 crc kubenswrapper[4734]: I1205 23:53:29.365379 4734 generic.go:334] "Generic (PLEG): container finished" podID="5a2af09b-1940-49e1-ba7d-725592ed000f" containerID="b9b5d29cf62e041d38833595216919331f0f670b6137e28d6f761d2042eedfcb" exitCode=0 Dec 05 23:53:29 crc kubenswrapper[4734]: I1205 23:53:29.365479 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zc6n4" event={"ID":"5a2af09b-1940-49e1-ba7d-725592ed000f","Type":"ContainerDied","Data":"b9b5d29cf62e041d38833595216919331f0f670b6137e28d6f761d2042eedfcb"} Dec 05 23:53:29 crc kubenswrapper[4734]: I1205 23:53:29.365833 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zc6n4" event={"ID":"5a2af09b-1940-49e1-ba7d-725592ed000f","Type":"ContainerStarted","Data":"2c4f11fe415a8e18e2bc50ac453531e6c4b6855ac317a53846e02444c966e5ad"} Dec 05 23:53:31 crc kubenswrapper[4734]: I1205 23:53:31.387605 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-zc6n4" event={"ID":"5a2af09b-1940-49e1-ba7d-725592ed000f","Type":"ContainerStarted","Data":"c6c21c0dc378010fcdec8fae2c3ff063843673b00b9477224d4648f89efdb7c9"} Dec 05 23:53:33 crc kubenswrapper[4734]: I1205 23:53:33.411085 4734 generic.go:334] "Generic (PLEG): container finished" podID="5a2af09b-1940-49e1-ba7d-725592ed000f" containerID="c6c21c0dc378010fcdec8fae2c3ff063843673b00b9477224d4648f89efdb7c9" exitCode=0 Dec 05 23:53:33 crc kubenswrapper[4734]: I1205 23:53:33.411186 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zc6n4" event={"ID":"5a2af09b-1940-49e1-ba7d-725592ed000f","Type":"ContainerDied","Data":"c6c21c0dc378010fcdec8fae2c3ff063843673b00b9477224d4648f89efdb7c9"} Dec 05 23:53:34 crc kubenswrapper[4734]: I1205 23:53:34.422988 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zc6n4" event={"ID":"5a2af09b-1940-49e1-ba7d-725592ed000f","Type":"ContainerStarted","Data":"ae8d4d45d970a6e5549a9469997070200e144d27d2f2c196efaa69e872407a3c"} Dec 05 23:53:34 crc kubenswrapper[4734]: I1205 23:53:34.448192 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zc6n4" podStartSLOduration=2.895470839 podStartE2EDuration="7.448165567s" podCreationTimestamp="2025-12-05 23:53:27 +0000 UTC" firstStartedPulling="2025-12-05 23:53:29.36763655 +0000 UTC m=+2030.051040826" lastFinishedPulling="2025-12-05 23:53:33.920331277 +0000 UTC m=+2034.603735554" observedRunningTime="2025-12-05 23:53:34.442861148 +0000 UTC m=+2035.126265474" watchObservedRunningTime="2025-12-05 23:53:34.448165567 +0000 UTC m=+2035.131569843" Dec 05 23:53:37 crc kubenswrapper[4734]: I1205 23:53:37.897565 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zc6n4" Dec 05 23:53:37 crc kubenswrapper[4734]: I1205 23:53:37.898495 4734 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zc6n4" Dec 05 23:53:38 crc kubenswrapper[4734]: I1205 23:53:38.956050 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zc6n4" podUID="5a2af09b-1940-49e1-ba7d-725592ed000f" containerName="registry-server" probeResult="failure" output=< Dec 05 23:53:38 crc kubenswrapper[4734]: timeout: failed to connect service ":50051" within 1s Dec 05 23:53:38 crc kubenswrapper[4734]: > Dec 05 23:53:48 crc kubenswrapper[4734]: I1205 23:53:48.351627 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zc6n4" Dec 05 23:53:48 crc kubenswrapper[4734]: I1205 23:53:48.402422 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zc6n4" Dec 05 23:53:48 crc kubenswrapper[4734]: I1205 23:53:48.605391 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zc6n4"] Dec 05 23:53:49 crc kubenswrapper[4734]: I1205 23:53:49.609925 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zc6n4" podUID="5a2af09b-1940-49e1-ba7d-725592ed000f" containerName="registry-server" containerID="cri-o://ae8d4d45d970a6e5549a9469997070200e144d27d2f2c196efaa69e872407a3c" gracePeriod=2 Dec 05 23:53:50 crc kubenswrapper[4734]: I1205 23:53:50.211727 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zc6n4" Dec 05 23:53:50 crc kubenswrapper[4734]: I1205 23:53:50.335171 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a2af09b-1940-49e1-ba7d-725592ed000f-catalog-content\") pod \"5a2af09b-1940-49e1-ba7d-725592ed000f\" (UID: \"5a2af09b-1940-49e1-ba7d-725592ed000f\") " Dec 05 23:53:50 crc kubenswrapper[4734]: I1205 23:53:50.335827 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a2af09b-1940-49e1-ba7d-725592ed000f-utilities\") pod \"5a2af09b-1940-49e1-ba7d-725592ed000f\" (UID: \"5a2af09b-1940-49e1-ba7d-725592ed000f\") " Dec 05 23:53:50 crc kubenswrapper[4734]: I1205 23:53:50.336051 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dlnc\" (UniqueName: \"kubernetes.io/projected/5a2af09b-1940-49e1-ba7d-725592ed000f-kube-api-access-7dlnc\") pod \"5a2af09b-1940-49e1-ba7d-725592ed000f\" (UID: \"5a2af09b-1940-49e1-ba7d-725592ed000f\") " Dec 05 23:53:50 crc kubenswrapper[4734]: I1205 23:53:50.336718 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a2af09b-1940-49e1-ba7d-725592ed000f-utilities" (OuterVolumeSpecName: "utilities") pod "5a2af09b-1940-49e1-ba7d-725592ed000f" (UID: "5a2af09b-1940-49e1-ba7d-725592ed000f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:53:50 crc kubenswrapper[4734]: I1205 23:53:50.345839 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a2af09b-1940-49e1-ba7d-725592ed000f-kube-api-access-7dlnc" (OuterVolumeSpecName: "kube-api-access-7dlnc") pod "5a2af09b-1940-49e1-ba7d-725592ed000f" (UID: "5a2af09b-1940-49e1-ba7d-725592ed000f"). InnerVolumeSpecName "kube-api-access-7dlnc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:53:50 crc kubenswrapper[4734]: I1205 23:53:50.438744 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dlnc\" (UniqueName: \"kubernetes.io/projected/5a2af09b-1940-49e1-ba7d-725592ed000f-kube-api-access-7dlnc\") on node \"crc\" DevicePath \"\"" Dec 05 23:53:50 crc kubenswrapper[4734]: I1205 23:53:50.438780 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a2af09b-1940-49e1-ba7d-725592ed000f-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:53:50 crc kubenswrapper[4734]: I1205 23:53:50.445097 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:53:50 crc kubenswrapper[4734]: I1205 23:53:50.445193 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:53:50 crc kubenswrapper[4734]: I1205 23:53:50.458795 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a2af09b-1940-49e1-ba7d-725592ed000f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a2af09b-1940-49e1-ba7d-725592ed000f" (UID: "5a2af09b-1940-49e1-ba7d-725592ed000f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:53:50 crc kubenswrapper[4734]: I1205 23:53:50.540878 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a2af09b-1940-49e1-ba7d-725592ed000f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:53:50 crc kubenswrapper[4734]: I1205 23:53:50.647216 4734 generic.go:334] "Generic (PLEG): container finished" podID="5a2af09b-1940-49e1-ba7d-725592ed000f" containerID="ae8d4d45d970a6e5549a9469997070200e144d27d2f2c196efaa69e872407a3c" exitCode=0 Dec 05 23:53:50 crc kubenswrapper[4734]: I1205 23:53:50.647297 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zc6n4" event={"ID":"5a2af09b-1940-49e1-ba7d-725592ed000f","Type":"ContainerDied","Data":"ae8d4d45d970a6e5549a9469997070200e144d27d2f2c196efaa69e872407a3c"} Dec 05 23:53:50 crc kubenswrapper[4734]: I1205 23:53:50.647337 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zc6n4" event={"ID":"5a2af09b-1940-49e1-ba7d-725592ed000f","Type":"ContainerDied","Data":"2c4f11fe415a8e18e2bc50ac453531e6c4b6855ac317a53846e02444c966e5ad"} Dec 05 23:53:50 crc kubenswrapper[4734]: I1205 23:53:50.647360 4734 scope.go:117] "RemoveContainer" containerID="ae8d4d45d970a6e5549a9469997070200e144d27d2f2c196efaa69e872407a3c" Dec 05 23:53:50 crc kubenswrapper[4734]: I1205 23:53:50.647575 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zc6n4" Dec 05 23:53:50 crc kubenswrapper[4734]: I1205 23:53:50.675938 4734 scope.go:117] "RemoveContainer" containerID="c6c21c0dc378010fcdec8fae2c3ff063843673b00b9477224d4648f89efdb7c9" Dec 05 23:53:50 crc kubenswrapper[4734]: I1205 23:53:50.688824 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zc6n4"] Dec 05 23:53:50 crc kubenswrapper[4734]: I1205 23:53:50.699082 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zc6n4"] Dec 05 23:53:50 crc kubenswrapper[4734]: I1205 23:53:50.724105 4734 scope.go:117] "RemoveContainer" containerID="b9b5d29cf62e041d38833595216919331f0f670b6137e28d6f761d2042eedfcb" Dec 05 23:53:50 crc kubenswrapper[4734]: I1205 23:53:50.756870 4734 scope.go:117] "RemoveContainer" containerID="ae8d4d45d970a6e5549a9469997070200e144d27d2f2c196efaa69e872407a3c" Dec 05 23:53:50 crc kubenswrapper[4734]: E1205 23:53:50.757602 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae8d4d45d970a6e5549a9469997070200e144d27d2f2c196efaa69e872407a3c\": container with ID starting with ae8d4d45d970a6e5549a9469997070200e144d27d2f2c196efaa69e872407a3c not found: ID does not exist" containerID="ae8d4d45d970a6e5549a9469997070200e144d27d2f2c196efaa69e872407a3c" Dec 05 23:53:50 crc kubenswrapper[4734]: I1205 23:53:50.757658 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae8d4d45d970a6e5549a9469997070200e144d27d2f2c196efaa69e872407a3c"} err="failed to get container status \"ae8d4d45d970a6e5549a9469997070200e144d27d2f2c196efaa69e872407a3c\": rpc error: code = NotFound desc = could not find container \"ae8d4d45d970a6e5549a9469997070200e144d27d2f2c196efaa69e872407a3c\": container with ID starting with ae8d4d45d970a6e5549a9469997070200e144d27d2f2c196efaa69e872407a3c not found: ID does 
not exist" Dec 05 23:53:50 crc kubenswrapper[4734]: I1205 23:53:50.757696 4734 scope.go:117] "RemoveContainer" containerID="c6c21c0dc378010fcdec8fae2c3ff063843673b00b9477224d4648f89efdb7c9" Dec 05 23:53:50 crc kubenswrapper[4734]: E1205 23:53:50.758200 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6c21c0dc378010fcdec8fae2c3ff063843673b00b9477224d4648f89efdb7c9\": container with ID starting with c6c21c0dc378010fcdec8fae2c3ff063843673b00b9477224d4648f89efdb7c9 not found: ID does not exist" containerID="c6c21c0dc378010fcdec8fae2c3ff063843673b00b9477224d4648f89efdb7c9" Dec 05 23:53:50 crc kubenswrapper[4734]: I1205 23:53:50.758243 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6c21c0dc378010fcdec8fae2c3ff063843673b00b9477224d4648f89efdb7c9"} err="failed to get container status \"c6c21c0dc378010fcdec8fae2c3ff063843673b00b9477224d4648f89efdb7c9\": rpc error: code = NotFound desc = could not find container \"c6c21c0dc378010fcdec8fae2c3ff063843673b00b9477224d4648f89efdb7c9\": container with ID starting with c6c21c0dc378010fcdec8fae2c3ff063843673b00b9477224d4648f89efdb7c9 not found: ID does not exist" Dec 05 23:53:50 crc kubenswrapper[4734]: I1205 23:53:50.758268 4734 scope.go:117] "RemoveContainer" containerID="b9b5d29cf62e041d38833595216919331f0f670b6137e28d6f761d2042eedfcb" Dec 05 23:53:50 crc kubenswrapper[4734]: E1205 23:53:50.758767 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9b5d29cf62e041d38833595216919331f0f670b6137e28d6f761d2042eedfcb\": container with ID starting with b9b5d29cf62e041d38833595216919331f0f670b6137e28d6f761d2042eedfcb not found: ID does not exist" containerID="b9b5d29cf62e041d38833595216919331f0f670b6137e28d6f761d2042eedfcb" Dec 05 23:53:50 crc kubenswrapper[4734]: I1205 23:53:50.758801 4734 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9b5d29cf62e041d38833595216919331f0f670b6137e28d6f761d2042eedfcb"} err="failed to get container status \"b9b5d29cf62e041d38833595216919331f0f670b6137e28d6f761d2042eedfcb\": rpc error: code = NotFound desc = could not find container \"b9b5d29cf62e041d38833595216919331f0f670b6137e28d6f761d2042eedfcb\": container with ID starting with b9b5d29cf62e041d38833595216919331f0f670b6137e28d6f761d2042eedfcb not found: ID does not exist" Dec 05 23:53:51 crc kubenswrapper[4734]: I1205 23:53:51.625355 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a2af09b-1940-49e1-ba7d-725592ed000f" path="/var/lib/kubelet/pods/5a2af09b-1940-49e1-ba7d-725592ed000f/volumes" Dec 05 23:53:54 crc kubenswrapper[4734]: I1205 23:53:54.712134 4734 generic.go:334] "Generic (PLEG): container finished" podID="131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98" containerID="c45e10f85fa2d9d612ab523bce0b6eb32a88bc6c6493c61174f9c796386f7df6" exitCode=0 Dec 05 23:53:54 crc kubenswrapper[4734]: I1205 23:53:54.712254 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" event={"ID":"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98","Type":"ContainerDied","Data":"c45e10f85fa2d9d612ab523bce0b6eb32a88bc6c6493c61174f9c796386f7df6"} Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.160289 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.282072 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-bootstrap-combined-ca-bundle\") pod \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.282656 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-libvirt-combined-ca-bundle\") pod \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.282842 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-nova-combined-ca-bundle\") pod \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.282893 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.282934 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-ovn-combined-ca-bundle\") pod \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " 
Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.282964 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.283000 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-openstack-edpm-ipam-ovn-default-certs-0\") pod \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.283030 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.283058 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-neutron-metadata-combined-ca-bundle\") pod \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.284194 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-inventory\") pod \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " Dec 05 23:53:56 
crc kubenswrapper[4734]: I1205 23:53:56.284232 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-repo-setup-combined-ca-bundle\") pod \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.284268 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-ssh-key\") pod \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.284299 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-telemetry-combined-ca-bundle\") pod \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.284327 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45shx\" (UniqueName: \"kubernetes.io/projected/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-kube-api-access-45shx\") pod \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\" (UID: \"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98\") " Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.291383 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98" (UID: "131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.293583 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98" (UID: "131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.296137 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98" (UID: "131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.296143 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98" (UID: "131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.296281 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98" (UID: "131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.296286 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-kube-api-access-45shx" (OuterVolumeSpecName: "kube-api-access-45shx") pod "131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98" (UID: "131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98"). InnerVolumeSpecName "kube-api-access-45shx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.296484 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98" (UID: "131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.296503 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98" (UID: "131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.298250 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98" (UID: "131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.298327 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98" (UID: "131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.299109 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98" (UID: "131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.302709 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98" (UID: "131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.324169 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-inventory" (OuterVolumeSpecName: "inventory") pod "131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98" (UID: "131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.331273 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98" (UID: "131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.387507 4734 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.387563 4734 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.387575 4734 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.387587 4734 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.387598 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45shx\" (UniqueName: \"kubernetes.io/projected/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-kube-api-access-45shx\") on node \"crc\" DevicePath \"\"" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.387609 4734 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.387622 4734 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.387631 4734 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.387640 4734 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.387650 4734 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.387659 4734 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.387670 4734 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 05 23:53:56 crc 
kubenswrapper[4734]: I1205 23:53:56.387681 4734 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.387691 4734 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.733672 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" event={"ID":"131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98","Type":"ContainerDied","Data":"06ccb30123e2802b8bcc29ef4e467eea1a5c64c1a2cee0fbaf48e44b1a40603e"} Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.733732 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06ccb30123e2802b8bcc29ef4e467eea1a5c64c1a2cee0fbaf48e44b1a40603e" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.733813 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.844948 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-dxg77"] Dec 05 23:53:56 crc kubenswrapper[4734]: E1205 23:53:56.845566 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a2af09b-1940-49e1-ba7d-725592ed000f" containerName="extract-content" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.845588 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a2af09b-1940-49e1-ba7d-725592ed000f" containerName="extract-content" Dec 05 23:53:56 crc kubenswrapper[4734]: E1205 23:53:56.845609 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a2af09b-1940-49e1-ba7d-725592ed000f" containerName="extract-utilities" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.845618 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a2af09b-1940-49e1-ba7d-725592ed000f" containerName="extract-utilities" Dec 05 23:53:56 crc kubenswrapper[4734]: E1205 23:53:56.845657 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.845670 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 05 23:53:56 crc kubenswrapper[4734]: E1205 23:53:56.845696 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a2af09b-1940-49e1-ba7d-725592ed000f" containerName="registry-server" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.845704 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a2af09b-1940-49e1-ba7d-725592ed000f" containerName="registry-server" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.845969 
4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.846016 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a2af09b-1940-49e1-ba7d-725592ed000f" containerName="registry-server" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.846921 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dxg77" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.850880 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.851041 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gsdqx" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.852592 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.853770 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.860409 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 23:53:56 crc kubenswrapper[4734]: I1205 23:53:56.881929 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-dxg77"] Dec 05 23:53:57 crc kubenswrapper[4734]: I1205 23:53:57.001200 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b772014-ade2-4ef1-9795-8a6eb255f57f-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dxg77\" (UID: 
\"4b772014-ade2-4ef1-9795-8a6eb255f57f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dxg77" Dec 05 23:53:57 crc kubenswrapper[4734]: I1205 23:53:57.001341 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b772014-ade2-4ef1-9795-8a6eb255f57f-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dxg77\" (UID: \"4b772014-ade2-4ef1-9795-8a6eb255f57f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dxg77" Dec 05 23:53:57 crc kubenswrapper[4734]: I1205 23:53:57.001401 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b772014-ade2-4ef1-9795-8a6eb255f57f-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dxg77\" (UID: \"4b772014-ade2-4ef1-9795-8a6eb255f57f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dxg77" Dec 05 23:53:57 crc kubenswrapper[4734]: I1205 23:53:57.001431 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4b772014-ade2-4ef1-9795-8a6eb255f57f-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dxg77\" (UID: \"4b772014-ade2-4ef1-9795-8a6eb255f57f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dxg77" Dec 05 23:53:57 crc kubenswrapper[4734]: I1205 23:53:57.001580 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcfbq\" (UniqueName: \"kubernetes.io/projected/4b772014-ade2-4ef1-9795-8a6eb255f57f-kube-api-access-mcfbq\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dxg77\" (UID: \"4b772014-ade2-4ef1-9795-8a6eb255f57f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dxg77" Dec 05 23:53:57 crc kubenswrapper[4734]: I1205 23:53:57.103869 4734 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b772014-ade2-4ef1-9795-8a6eb255f57f-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dxg77\" (UID: \"4b772014-ade2-4ef1-9795-8a6eb255f57f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dxg77" Dec 05 23:53:57 crc kubenswrapper[4734]: I1205 23:53:57.103969 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b772014-ade2-4ef1-9795-8a6eb255f57f-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dxg77\" (UID: \"4b772014-ade2-4ef1-9795-8a6eb255f57f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dxg77" Dec 05 23:53:57 crc kubenswrapper[4734]: I1205 23:53:57.103997 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4b772014-ade2-4ef1-9795-8a6eb255f57f-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dxg77\" (UID: \"4b772014-ade2-4ef1-9795-8a6eb255f57f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dxg77" Dec 05 23:53:57 crc kubenswrapper[4734]: I1205 23:53:57.104044 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcfbq\" (UniqueName: \"kubernetes.io/projected/4b772014-ade2-4ef1-9795-8a6eb255f57f-kube-api-access-mcfbq\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dxg77\" (UID: \"4b772014-ade2-4ef1-9795-8a6eb255f57f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dxg77" Dec 05 23:53:57 crc kubenswrapper[4734]: I1205 23:53:57.104130 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b772014-ade2-4ef1-9795-8a6eb255f57f-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dxg77\" (UID: \"4b772014-ade2-4ef1-9795-8a6eb255f57f\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dxg77" Dec 05 23:53:57 crc kubenswrapper[4734]: I1205 23:53:57.105250 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4b772014-ade2-4ef1-9795-8a6eb255f57f-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dxg77\" (UID: \"4b772014-ade2-4ef1-9795-8a6eb255f57f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dxg77" Dec 05 23:53:57 crc kubenswrapper[4734]: I1205 23:53:57.111709 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b772014-ade2-4ef1-9795-8a6eb255f57f-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dxg77\" (UID: \"4b772014-ade2-4ef1-9795-8a6eb255f57f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dxg77" Dec 05 23:53:57 crc kubenswrapper[4734]: I1205 23:53:57.114484 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b772014-ade2-4ef1-9795-8a6eb255f57f-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dxg77\" (UID: \"4b772014-ade2-4ef1-9795-8a6eb255f57f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dxg77" Dec 05 23:53:57 crc kubenswrapper[4734]: I1205 23:53:57.120337 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b772014-ade2-4ef1-9795-8a6eb255f57f-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-dxg77\" (UID: \"4b772014-ade2-4ef1-9795-8a6eb255f57f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dxg77" Dec 05 23:53:57 crc kubenswrapper[4734]: I1205 23:53:57.130761 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcfbq\" (UniqueName: \"kubernetes.io/projected/4b772014-ade2-4ef1-9795-8a6eb255f57f-kube-api-access-mcfbq\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-dxg77\" (UID: \"4b772014-ade2-4ef1-9795-8a6eb255f57f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dxg77" Dec 05 23:53:57 crc kubenswrapper[4734]: I1205 23:53:57.166826 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dxg77" Dec 05 23:53:57 crc kubenswrapper[4734]: I1205 23:53:57.543748 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-dxg77"] Dec 05 23:53:57 crc kubenswrapper[4734]: I1205 23:53:57.744998 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dxg77" event={"ID":"4b772014-ade2-4ef1-9795-8a6eb255f57f","Type":"ContainerStarted","Data":"8933bfc324b15dbf4fe1c815523fd1bd66c155a952151f06d289ff72fd2d462e"} Dec 05 23:53:58 crc kubenswrapper[4734]: I1205 23:53:58.757872 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dxg77" event={"ID":"4b772014-ade2-4ef1-9795-8a6eb255f57f","Type":"ContainerStarted","Data":"388e55955ac6a19997b11ec4b31bd9121bc9fcb2c35c309111c71a974c2a63e2"} Dec 05 23:53:58 crc kubenswrapper[4734]: I1205 23:53:58.786133 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dxg77" podStartSLOduration=2.389632627 podStartE2EDuration="2.786108066s" podCreationTimestamp="2025-12-05 23:53:56 +0000 UTC" firstStartedPulling="2025-12-05 23:53:57.549612191 +0000 UTC m=+2058.233016467" lastFinishedPulling="2025-12-05 23:53:57.94608761 +0000 UTC m=+2058.629491906" observedRunningTime="2025-12-05 23:53:58.778334026 +0000 UTC m=+2059.461738302" watchObservedRunningTime="2025-12-05 23:53:58.786108066 +0000 UTC m=+2059.469512342" Dec 05 23:54:20 crc kubenswrapper[4734]: I1205 23:54:20.445250 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:54:20 crc kubenswrapper[4734]: I1205 23:54:20.446235 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:54:50 crc kubenswrapper[4734]: I1205 23:54:50.445034 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:54:50 crc kubenswrapper[4734]: I1205 23:54:50.445988 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:54:50 crc kubenswrapper[4734]: I1205 23:54:50.446060 4734 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" Dec 05 23:54:50 crc kubenswrapper[4734]: I1205 23:54:50.447200 4734 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"62ea0ca90a403cb22c1108463dd79f495ead53f2907ab42b91e5688249314f62"} pod="openshift-machine-config-operator/machine-config-daemon-vn94d" containerMessage="Container machine-config-daemon failed liveness probe, will be 
restarted" Dec 05 23:54:50 crc kubenswrapper[4734]: I1205 23:54:50.447264 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" containerID="cri-o://62ea0ca90a403cb22c1108463dd79f495ead53f2907ab42b91e5688249314f62" gracePeriod=600 Dec 05 23:54:50 crc kubenswrapper[4734]: I1205 23:54:50.617027 4734 generic.go:334] "Generic (PLEG): container finished" podID="65758270-a7a7-46b5-af95-0588daf9fa86" containerID="62ea0ca90a403cb22c1108463dd79f495ead53f2907ab42b91e5688249314f62" exitCode=0 Dec 05 23:54:50 crc kubenswrapper[4734]: I1205 23:54:50.617113 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerDied","Data":"62ea0ca90a403cb22c1108463dd79f495ead53f2907ab42b91e5688249314f62"} Dec 05 23:54:50 crc kubenswrapper[4734]: I1205 23:54:50.617179 4734 scope.go:117] "RemoveContainer" containerID="bf2990588260a60447594f55883e9e43735892e3ca942ebe017df1d6b8641fec" Dec 05 23:54:51 crc kubenswrapper[4734]: I1205 23:54:51.716023 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerStarted","Data":"c1125a47316243dbc8e4b9f56d99d1db26d491a48005ab2e218e005031c75762"} Dec 05 23:55:07 crc kubenswrapper[4734]: I1205 23:55:07.881387 4734 generic.go:334] "Generic (PLEG): container finished" podID="4b772014-ade2-4ef1-9795-8a6eb255f57f" containerID="388e55955ac6a19997b11ec4b31bd9121bc9fcb2c35c309111c71a974c2a63e2" exitCode=0 Dec 05 23:55:07 crc kubenswrapper[4734]: I1205 23:55:07.881471 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dxg77" 
event={"ID":"4b772014-ade2-4ef1-9795-8a6eb255f57f","Type":"ContainerDied","Data":"388e55955ac6a19997b11ec4b31bd9121bc9fcb2c35c309111c71a974c2a63e2"} Dec 05 23:55:09 crc kubenswrapper[4734]: I1205 23:55:09.395430 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dxg77" Dec 05 23:55:09 crc kubenswrapper[4734]: I1205 23:55:09.461858 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4b772014-ade2-4ef1-9795-8a6eb255f57f-ovncontroller-config-0\") pod \"4b772014-ade2-4ef1-9795-8a6eb255f57f\" (UID: \"4b772014-ade2-4ef1-9795-8a6eb255f57f\") " Dec 05 23:55:09 crc kubenswrapper[4734]: I1205 23:55:09.462472 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b772014-ade2-4ef1-9795-8a6eb255f57f-inventory\") pod \"4b772014-ade2-4ef1-9795-8a6eb255f57f\" (UID: \"4b772014-ade2-4ef1-9795-8a6eb255f57f\") " Dec 05 23:55:09 crc kubenswrapper[4734]: I1205 23:55:09.462593 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b772014-ade2-4ef1-9795-8a6eb255f57f-ssh-key\") pod \"4b772014-ade2-4ef1-9795-8a6eb255f57f\" (UID: \"4b772014-ade2-4ef1-9795-8a6eb255f57f\") " Dec 05 23:55:09 crc kubenswrapper[4734]: I1205 23:55:09.462735 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcfbq\" (UniqueName: \"kubernetes.io/projected/4b772014-ade2-4ef1-9795-8a6eb255f57f-kube-api-access-mcfbq\") pod \"4b772014-ade2-4ef1-9795-8a6eb255f57f\" (UID: \"4b772014-ade2-4ef1-9795-8a6eb255f57f\") " Dec 05 23:55:09 crc kubenswrapper[4734]: I1205 23:55:09.462786 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4b772014-ade2-4ef1-9795-8a6eb255f57f-ovn-combined-ca-bundle\") pod \"4b772014-ade2-4ef1-9795-8a6eb255f57f\" (UID: \"4b772014-ade2-4ef1-9795-8a6eb255f57f\") " Dec 05 23:55:09 crc kubenswrapper[4734]: I1205 23:55:09.470609 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b772014-ade2-4ef1-9795-8a6eb255f57f-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "4b772014-ade2-4ef1-9795-8a6eb255f57f" (UID: "4b772014-ade2-4ef1-9795-8a6eb255f57f"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:55:09 crc kubenswrapper[4734]: I1205 23:55:09.471457 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b772014-ade2-4ef1-9795-8a6eb255f57f-kube-api-access-mcfbq" (OuterVolumeSpecName: "kube-api-access-mcfbq") pod "4b772014-ade2-4ef1-9795-8a6eb255f57f" (UID: "4b772014-ade2-4ef1-9795-8a6eb255f57f"). InnerVolumeSpecName "kube-api-access-mcfbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:55:09 crc kubenswrapper[4734]: I1205 23:55:09.494900 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b772014-ade2-4ef1-9795-8a6eb255f57f-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "4b772014-ade2-4ef1-9795-8a6eb255f57f" (UID: "4b772014-ade2-4ef1-9795-8a6eb255f57f"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 23:55:09 crc kubenswrapper[4734]: I1205 23:55:09.498515 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b772014-ade2-4ef1-9795-8a6eb255f57f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4b772014-ade2-4ef1-9795-8a6eb255f57f" (UID: "4b772014-ade2-4ef1-9795-8a6eb255f57f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:55:09 crc kubenswrapper[4734]: I1205 23:55:09.500001 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b772014-ade2-4ef1-9795-8a6eb255f57f-inventory" (OuterVolumeSpecName: "inventory") pod "4b772014-ade2-4ef1-9795-8a6eb255f57f" (UID: "4b772014-ade2-4ef1-9795-8a6eb255f57f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:55:09 crc kubenswrapper[4734]: I1205 23:55:09.566242 4734 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4b772014-ade2-4ef1-9795-8a6eb255f57f-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 23:55:09 crc kubenswrapper[4734]: I1205 23:55:09.566287 4734 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b772014-ade2-4ef1-9795-8a6eb255f57f-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 23:55:09 crc kubenswrapper[4734]: I1205 23:55:09.566296 4734 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b772014-ade2-4ef1-9795-8a6eb255f57f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 23:55:09 crc kubenswrapper[4734]: I1205 23:55:09.566309 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcfbq\" (UniqueName: \"kubernetes.io/projected/4b772014-ade2-4ef1-9795-8a6eb255f57f-kube-api-access-mcfbq\") on node \"crc\" DevicePath \"\"" Dec 05 23:55:09 crc kubenswrapper[4734]: I1205 23:55:09.566320 4734 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b772014-ade2-4ef1-9795-8a6eb255f57f-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:55:09 crc kubenswrapper[4734]: I1205 23:55:09.906391 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dxg77" event={"ID":"4b772014-ade2-4ef1-9795-8a6eb255f57f","Type":"ContainerDied","Data":"8933bfc324b15dbf4fe1c815523fd1bd66c155a952151f06d289ff72fd2d462e"} Dec 05 23:55:09 crc kubenswrapper[4734]: I1205 23:55:09.906458 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8933bfc324b15dbf4fe1c815523fd1bd66c155a952151f06d289ff72fd2d462e" Dec 05 23:55:09 crc kubenswrapper[4734]: I1205 23:55:09.906583 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-dxg77" Dec 05 23:55:10 crc kubenswrapper[4734]: I1205 23:55:10.023731 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2"] Dec 05 23:55:10 crc kubenswrapper[4734]: E1205 23:55:10.024265 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b772014-ade2-4ef1-9795-8a6eb255f57f" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 05 23:55:10 crc kubenswrapper[4734]: I1205 23:55:10.024473 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b772014-ade2-4ef1-9795-8a6eb255f57f" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 05 23:55:10 crc kubenswrapper[4734]: I1205 23:55:10.024724 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b772014-ade2-4ef1-9795-8a6eb255f57f" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 05 23:55:10 crc kubenswrapper[4734]: I1205 23:55:10.027060 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2" Dec 05 23:55:10 crc kubenswrapper[4734]: I1205 23:55:10.033606 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 23:55:10 crc kubenswrapper[4734]: I1205 23:55:10.033849 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 23:55:10 crc kubenswrapper[4734]: I1205 23:55:10.034016 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 05 23:55:10 crc kubenswrapper[4734]: I1205 23:55:10.034163 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 05 23:55:10 crc kubenswrapper[4734]: I1205 23:55:10.034398 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 23:55:10 crc kubenswrapper[4734]: I1205 23:55:10.034590 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gsdqx" Dec 05 23:55:10 crc kubenswrapper[4734]: I1205 23:55:10.037373 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2"] Dec 05 23:55:10 crc kubenswrapper[4734]: I1205 23:55:10.077462 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2\" (UID: \"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2" Dec 05 23:55:10 crc kubenswrapper[4734]: I1205 23:55:10.077675 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2\" (UID: \"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2" Dec 05 23:55:10 crc kubenswrapper[4734]: I1205 23:55:10.077736 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2\" (UID: \"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2" Dec 05 23:55:10 crc kubenswrapper[4734]: I1205 23:55:10.077833 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2\" (UID: \"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2" Dec 05 23:55:10 crc kubenswrapper[4734]: I1205 23:55:10.077913 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2\" (UID: \"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2" Dec 05 23:55:10 crc kubenswrapper[4734]: I1205 23:55:10.077943 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-d2wrz\" (UniqueName: \"kubernetes.io/projected/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-kube-api-access-d2wrz\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2\" (UID: \"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2" Dec 05 23:55:10 crc kubenswrapper[4734]: I1205 23:55:10.180800 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2\" (UID: \"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2" Dec 05 23:55:10 crc kubenswrapper[4734]: I1205 23:55:10.181274 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2wrz\" (UniqueName: \"kubernetes.io/projected/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-kube-api-access-d2wrz\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2\" (UID: \"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2" Dec 05 23:55:10 crc kubenswrapper[4734]: I1205 23:55:10.181481 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2\" (UID: \"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2" Dec 05 23:55:10 crc kubenswrapper[4734]: I1205 23:55:10.181605 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-nova-metadata-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2\" (UID: \"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2" Dec 05 23:55:10 crc kubenswrapper[4734]: I1205 23:55:10.181751 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2\" (UID: \"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2" Dec 05 23:55:10 crc kubenswrapper[4734]: I1205 23:55:10.181883 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2\" (UID: \"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2" Dec 05 23:55:10 crc kubenswrapper[4734]: I1205 23:55:10.189157 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2\" (UID: \"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2" Dec 05 23:55:10 crc kubenswrapper[4734]: I1205 23:55:10.189413 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2\" (UID: \"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2" Dec 05 23:55:10 crc kubenswrapper[4734]: I1205 23:55:10.189497 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2\" (UID: \"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2" Dec 05 23:55:10 crc kubenswrapper[4734]: I1205 23:55:10.190284 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2\" (UID: \"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2" Dec 05 23:55:10 crc kubenswrapper[4734]: I1205 23:55:10.191146 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2\" (UID: \"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2" Dec 05 23:55:10 crc kubenswrapper[4734]: I1205 23:55:10.208325 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2wrz\" (UniqueName: \"kubernetes.io/projected/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-kube-api-access-d2wrz\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2\" (UID: \"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2" Dec 05 23:55:10 crc kubenswrapper[4734]: I1205 23:55:10.358916 
4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2" Dec 05 23:55:10 crc kubenswrapper[4734]: I1205 23:55:10.984011 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2"] Dec 05 23:55:11 crc kubenswrapper[4734]: I1205 23:55:11.937584 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2" event={"ID":"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453","Type":"ContainerStarted","Data":"39e6ae774ae73483ab0b7bd39b3fd42ac82ac20a9b7f0bf4dc894b2a3d497325"} Dec 05 23:55:11 crc kubenswrapper[4734]: I1205 23:55:11.938434 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2" event={"ID":"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453","Type":"ContainerStarted","Data":"e95dc2b8f0ec18bac9f7a0c76434d4815f5a701d8dd8b7e5231f7ca9dbe1fdae"} Dec 05 23:55:11 crc kubenswrapper[4734]: I1205 23:55:11.963728 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2" podStartSLOduration=2.485329842 podStartE2EDuration="2.963564613s" podCreationTimestamp="2025-12-05 23:55:09 +0000 UTC" firstStartedPulling="2025-12-05 23:55:10.998718566 +0000 UTC m=+2131.682122832" lastFinishedPulling="2025-12-05 23:55:11.476953327 +0000 UTC m=+2132.160357603" observedRunningTime="2025-12-05 23:55:11.956258775 +0000 UTC m=+2132.639663051" watchObservedRunningTime="2025-12-05 23:55:11.963564613 +0000 UTC m=+2132.646968899" Dec 05 23:55:48 crc kubenswrapper[4734]: I1205 23:55:48.041960 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5jmk2"] Dec 05 23:55:48 crc kubenswrapper[4734]: I1205 23:55:48.045094 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jmk2" Dec 05 23:55:48 crc kubenswrapper[4734]: I1205 23:55:48.055672 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jmk2"] Dec 05 23:55:48 crc kubenswrapper[4734]: I1205 23:55:48.171172 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww4wf\" (UniqueName: \"kubernetes.io/projected/f80d763b-7e9c-40a0-be4e-95ba6b628021-kube-api-access-ww4wf\") pod \"redhat-marketplace-5jmk2\" (UID: \"f80d763b-7e9c-40a0-be4e-95ba6b628021\") " pod="openshift-marketplace/redhat-marketplace-5jmk2" Dec 05 23:55:48 crc kubenswrapper[4734]: I1205 23:55:48.171327 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f80d763b-7e9c-40a0-be4e-95ba6b628021-catalog-content\") pod \"redhat-marketplace-5jmk2\" (UID: \"f80d763b-7e9c-40a0-be4e-95ba6b628021\") " pod="openshift-marketplace/redhat-marketplace-5jmk2" Dec 05 23:55:48 crc kubenswrapper[4734]: I1205 23:55:48.171492 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f80d763b-7e9c-40a0-be4e-95ba6b628021-utilities\") pod \"redhat-marketplace-5jmk2\" (UID: \"f80d763b-7e9c-40a0-be4e-95ba6b628021\") " pod="openshift-marketplace/redhat-marketplace-5jmk2" Dec 05 23:55:48 crc kubenswrapper[4734]: I1205 23:55:48.272788 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww4wf\" (UniqueName: \"kubernetes.io/projected/f80d763b-7e9c-40a0-be4e-95ba6b628021-kube-api-access-ww4wf\") pod \"redhat-marketplace-5jmk2\" (UID: \"f80d763b-7e9c-40a0-be4e-95ba6b628021\") " pod="openshift-marketplace/redhat-marketplace-5jmk2" Dec 05 23:55:48 crc kubenswrapper[4734]: I1205 23:55:48.273204 4734 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f80d763b-7e9c-40a0-be4e-95ba6b628021-catalog-content\") pod \"redhat-marketplace-5jmk2\" (UID: \"f80d763b-7e9c-40a0-be4e-95ba6b628021\") " pod="openshift-marketplace/redhat-marketplace-5jmk2" Dec 05 23:55:48 crc kubenswrapper[4734]: I1205 23:55:48.273328 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f80d763b-7e9c-40a0-be4e-95ba6b628021-utilities\") pod \"redhat-marketplace-5jmk2\" (UID: \"f80d763b-7e9c-40a0-be4e-95ba6b628021\") " pod="openshift-marketplace/redhat-marketplace-5jmk2" Dec 05 23:55:48 crc kubenswrapper[4734]: I1205 23:55:48.273904 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f80d763b-7e9c-40a0-be4e-95ba6b628021-utilities\") pod \"redhat-marketplace-5jmk2\" (UID: \"f80d763b-7e9c-40a0-be4e-95ba6b628021\") " pod="openshift-marketplace/redhat-marketplace-5jmk2" Dec 05 23:55:48 crc kubenswrapper[4734]: I1205 23:55:48.274046 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f80d763b-7e9c-40a0-be4e-95ba6b628021-catalog-content\") pod \"redhat-marketplace-5jmk2\" (UID: \"f80d763b-7e9c-40a0-be4e-95ba6b628021\") " pod="openshift-marketplace/redhat-marketplace-5jmk2" Dec 05 23:55:48 crc kubenswrapper[4734]: I1205 23:55:48.296051 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww4wf\" (UniqueName: \"kubernetes.io/projected/f80d763b-7e9c-40a0-be4e-95ba6b628021-kube-api-access-ww4wf\") pod \"redhat-marketplace-5jmk2\" (UID: \"f80d763b-7e9c-40a0-be4e-95ba6b628021\") " pod="openshift-marketplace/redhat-marketplace-5jmk2" Dec 05 23:55:48 crc kubenswrapper[4734]: I1205 23:55:48.374808 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jmk2" Dec 05 23:55:48 crc kubenswrapper[4734]: I1205 23:55:48.952923 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jmk2"] Dec 05 23:55:49 crc kubenswrapper[4734]: I1205 23:55:49.333202 4734 generic.go:334] "Generic (PLEG): container finished" podID="f80d763b-7e9c-40a0-be4e-95ba6b628021" containerID="e08128f837daf7ed45b26f86bc38b8efc2132c21b5d1e11022adcc7270e82c46" exitCode=0 Dec 05 23:55:49 crc kubenswrapper[4734]: I1205 23:55:49.333595 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jmk2" event={"ID":"f80d763b-7e9c-40a0-be4e-95ba6b628021","Type":"ContainerDied","Data":"e08128f837daf7ed45b26f86bc38b8efc2132c21b5d1e11022adcc7270e82c46"} Dec 05 23:55:49 crc kubenswrapper[4734]: I1205 23:55:49.333851 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jmk2" event={"ID":"f80d763b-7e9c-40a0-be4e-95ba6b628021","Type":"ContainerStarted","Data":"5fe342c4ab3930fe028fd9789a1463661899fcdb017613ad9f9820c57a4ebea8"} Dec 05 23:55:50 crc kubenswrapper[4734]: I1205 23:55:50.347619 4734 generic.go:334] "Generic (PLEG): container finished" podID="f80d763b-7e9c-40a0-be4e-95ba6b628021" containerID="c7cfd5621783777a8f651344912bb88c25061265503501ef7db7712a7a2970f0" exitCode=0 Dec 05 23:55:50 crc kubenswrapper[4734]: I1205 23:55:50.347747 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jmk2" event={"ID":"f80d763b-7e9c-40a0-be4e-95ba6b628021","Type":"ContainerDied","Data":"c7cfd5621783777a8f651344912bb88c25061265503501ef7db7712a7a2970f0"} Dec 05 23:55:51 crc kubenswrapper[4734]: I1205 23:55:51.362917 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jmk2" 
event={"ID":"f80d763b-7e9c-40a0-be4e-95ba6b628021","Type":"ContainerStarted","Data":"562d6c27c82ff186c04c726fcdd712518b4e47c66d8266e8cc3415456b771269"} Dec 05 23:55:51 crc kubenswrapper[4734]: I1205 23:55:51.392477 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5jmk2" podStartSLOduration=1.947783279 podStartE2EDuration="3.392447012s" podCreationTimestamp="2025-12-05 23:55:48 +0000 UTC" firstStartedPulling="2025-12-05 23:55:49.337174561 +0000 UTC m=+2170.020578827" lastFinishedPulling="2025-12-05 23:55:50.781838284 +0000 UTC m=+2171.465242560" observedRunningTime="2025-12-05 23:55:51.386282191 +0000 UTC m=+2172.069686477" watchObservedRunningTime="2025-12-05 23:55:51.392447012 +0000 UTC m=+2172.075851288" Dec 05 23:55:58 crc kubenswrapper[4734]: I1205 23:55:58.375173 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5jmk2" Dec 05 23:55:58 crc kubenswrapper[4734]: I1205 23:55:58.376023 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5jmk2" Dec 05 23:55:58 crc kubenswrapper[4734]: I1205 23:55:58.425802 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5jmk2" Dec 05 23:55:58 crc kubenswrapper[4734]: I1205 23:55:58.493811 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5jmk2" Dec 05 23:55:58 crc kubenswrapper[4734]: I1205 23:55:58.666348 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jmk2"] Dec 05 23:56:00 crc kubenswrapper[4734]: I1205 23:56:00.453311 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5jmk2" podUID="f80d763b-7e9c-40a0-be4e-95ba6b628021" containerName="registry-server" 
containerID="cri-o://562d6c27c82ff186c04c726fcdd712518b4e47c66d8266e8cc3415456b771269" gracePeriod=2 Dec 05 23:56:01 crc kubenswrapper[4734]: I1205 23:56:01.052584 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jmk2" Dec 05 23:56:01 crc kubenswrapper[4734]: I1205 23:56:01.242718 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f80d763b-7e9c-40a0-be4e-95ba6b628021-utilities\") pod \"f80d763b-7e9c-40a0-be4e-95ba6b628021\" (UID: \"f80d763b-7e9c-40a0-be4e-95ba6b628021\") " Dec 05 23:56:01 crc kubenswrapper[4734]: I1205 23:56:01.243321 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f80d763b-7e9c-40a0-be4e-95ba6b628021-catalog-content\") pod \"f80d763b-7e9c-40a0-be4e-95ba6b628021\" (UID: \"f80d763b-7e9c-40a0-be4e-95ba6b628021\") " Dec 05 23:56:01 crc kubenswrapper[4734]: I1205 23:56:01.243520 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ww4wf\" (UniqueName: \"kubernetes.io/projected/f80d763b-7e9c-40a0-be4e-95ba6b628021-kube-api-access-ww4wf\") pod \"f80d763b-7e9c-40a0-be4e-95ba6b628021\" (UID: \"f80d763b-7e9c-40a0-be4e-95ba6b628021\") " Dec 05 23:56:01 crc kubenswrapper[4734]: I1205 23:56:01.244256 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f80d763b-7e9c-40a0-be4e-95ba6b628021-utilities" (OuterVolumeSpecName: "utilities") pod "f80d763b-7e9c-40a0-be4e-95ba6b628021" (UID: "f80d763b-7e9c-40a0-be4e-95ba6b628021"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:56:01 crc kubenswrapper[4734]: I1205 23:56:01.252821 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f80d763b-7e9c-40a0-be4e-95ba6b628021-kube-api-access-ww4wf" (OuterVolumeSpecName: "kube-api-access-ww4wf") pod "f80d763b-7e9c-40a0-be4e-95ba6b628021" (UID: "f80d763b-7e9c-40a0-be4e-95ba6b628021"). InnerVolumeSpecName "kube-api-access-ww4wf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:56:01 crc kubenswrapper[4734]: I1205 23:56:01.264704 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f80d763b-7e9c-40a0-be4e-95ba6b628021-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f80d763b-7e9c-40a0-be4e-95ba6b628021" (UID: "f80d763b-7e9c-40a0-be4e-95ba6b628021"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:56:01 crc kubenswrapper[4734]: I1205 23:56:01.347046 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f80d763b-7e9c-40a0-be4e-95ba6b628021-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:56:01 crc kubenswrapper[4734]: I1205 23:56:01.347279 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ww4wf\" (UniqueName: \"kubernetes.io/projected/f80d763b-7e9c-40a0-be4e-95ba6b628021-kube-api-access-ww4wf\") on node \"crc\" DevicePath \"\"" Dec 05 23:56:01 crc kubenswrapper[4734]: I1205 23:56:01.347292 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f80d763b-7e9c-40a0-be4e-95ba6b628021-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:56:01 crc kubenswrapper[4734]: I1205 23:56:01.464889 4734 generic.go:334] "Generic (PLEG): container finished" podID="f80d763b-7e9c-40a0-be4e-95ba6b628021" 
containerID="562d6c27c82ff186c04c726fcdd712518b4e47c66d8266e8cc3415456b771269" exitCode=0 Dec 05 23:56:01 crc kubenswrapper[4734]: I1205 23:56:01.464947 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jmk2" event={"ID":"f80d763b-7e9c-40a0-be4e-95ba6b628021","Type":"ContainerDied","Data":"562d6c27c82ff186c04c726fcdd712518b4e47c66d8266e8cc3415456b771269"} Dec 05 23:56:01 crc kubenswrapper[4734]: I1205 23:56:01.464982 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jmk2" event={"ID":"f80d763b-7e9c-40a0-be4e-95ba6b628021","Type":"ContainerDied","Data":"5fe342c4ab3930fe028fd9789a1463661899fcdb017613ad9f9820c57a4ebea8"} Dec 05 23:56:01 crc kubenswrapper[4734]: I1205 23:56:01.465001 4734 scope.go:117] "RemoveContainer" containerID="562d6c27c82ff186c04c726fcdd712518b4e47c66d8266e8cc3415456b771269" Dec 05 23:56:01 crc kubenswrapper[4734]: I1205 23:56:01.465014 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jmk2" Dec 05 23:56:01 crc kubenswrapper[4734]: I1205 23:56:01.511340 4734 scope.go:117] "RemoveContainer" containerID="c7cfd5621783777a8f651344912bb88c25061265503501ef7db7712a7a2970f0" Dec 05 23:56:01 crc kubenswrapper[4734]: I1205 23:56:01.517095 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jmk2"] Dec 05 23:56:01 crc kubenswrapper[4734]: I1205 23:56:01.526951 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jmk2"] Dec 05 23:56:01 crc kubenswrapper[4734]: I1205 23:56:01.533024 4734 scope.go:117] "RemoveContainer" containerID="e08128f837daf7ed45b26f86bc38b8efc2132c21b5d1e11022adcc7270e82c46" Dec 05 23:56:01 crc kubenswrapper[4734]: I1205 23:56:01.585653 4734 scope.go:117] "RemoveContainer" containerID="562d6c27c82ff186c04c726fcdd712518b4e47c66d8266e8cc3415456b771269" Dec 05 23:56:01 crc kubenswrapper[4734]: E1205 23:56:01.587352 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"562d6c27c82ff186c04c726fcdd712518b4e47c66d8266e8cc3415456b771269\": container with ID starting with 562d6c27c82ff186c04c726fcdd712518b4e47c66d8266e8cc3415456b771269 not found: ID does not exist" containerID="562d6c27c82ff186c04c726fcdd712518b4e47c66d8266e8cc3415456b771269" Dec 05 23:56:01 crc kubenswrapper[4734]: I1205 23:56:01.587430 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"562d6c27c82ff186c04c726fcdd712518b4e47c66d8266e8cc3415456b771269"} err="failed to get container status \"562d6c27c82ff186c04c726fcdd712518b4e47c66d8266e8cc3415456b771269\": rpc error: code = NotFound desc = could not find container \"562d6c27c82ff186c04c726fcdd712518b4e47c66d8266e8cc3415456b771269\": container with ID starting with 562d6c27c82ff186c04c726fcdd712518b4e47c66d8266e8cc3415456b771269 not found: 
ID does not exist" Dec 05 23:56:01 crc kubenswrapper[4734]: I1205 23:56:01.587472 4734 scope.go:117] "RemoveContainer" containerID="c7cfd5621783777a8f651344912bb88c25061265503501ef7db7712a7a2970f0" Dec 05 23:56:01 crc kubenswrapper[4734]: E1205 23:56:01.588142 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7cfd5621783777a8f651344912bb88c25061265503501ef7db7712a7a2970f0\": container with ID starting with c7cfd5621783777a8f651344912bb88c25061265503501ef7db7712a7a2970f0 not found: ID does not exist" containerID="c7cfd5621783777a8f651344912bb88c25061265503501ef7db7712a7a2970f0" Dec 05 23:56:01 crc kubenswrapper[4734]: I1205 23:56:01.588192 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7cfd5621783777a8f651344912bb88c25061265503501ef7db7712a7a2970f0"} err="failed to get container status \"c7cfd5621783777a8f651344912bb88c25061265503501ef7db7712a7a2970f0\": rpc error: code = NotFound desc = could not find container \"c7cfd5621783777a8f651344912bb88c25061265503501ef7db7712a7a2970f0\": container with ID starting with c7cfd5621783777a8f651344912bb88c25061265503501ef7db7712a7a2970f0 not found: ID does not exist" Dec 05 23:56:01 crc kubenswrapper[4734]: I1205 23:56:01.588225 4734 scope.go:117] "RemoveContainer" containerID="e08128f837daf7ed45b26f86bc38b8efc2132c21b5d1e11022adcc7270e82c46" Dec 05 23:56:01 crc kubenswrapper[4734]: E1205 23:56:01.588599 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e08128f837daf7ed45b26f86bc38b8efc2132c21b5d1e11022adcc7270e82c46\": container with ID starting with e08128f837daf7ed45b26f86bc38b8efc2132c21b5d1e11022adcc7270e82c46 not found: ID does not exist" containerID="e08128f837daf7ed45b26f86bc38b8efc2132c21b5d1e11022adcc7270e82c46" Dec 05 23:56:01 crc kubenswrapper[4734]: I1205 23:56:01.588627 4734 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e08128f837daf7ed45b26f86bc38b8efc2132c21b5d1e11022adcc7270e82c46"} err="failed to get container status \"e08128f837daf7ed45b26f86bc38b8efc2132c21b5d1e11022adcc7270e82c46\": rpc error: code = NotFound desc = could not find container \"e08128f837daf7ed45b26f86bc38b8efc2132c21b5d1e11022adcc7270e82c46\": container with ID starting with e08128f837daf7ed45b26f86bc38b8efc2132c21b5d1e11022adcc7270e82c46 not found: ID does not exist" Dec 05 23:56:01 crc kubenswrapper[4734]: I1205 23:56:01.630018 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f80d763b-7e9c-40a0-be4e-95ba6b628021" path="/var/lib/kubelet/pods/f80d763b-7e9c-40a0-be4e-95ba6b628021/volumes" Dec 05 23:56:03 crc kubenswrapper[4734]: I1205 23:56:03.489168 4734 generic.go:334] "Generic (PLEG): container finished" podID="e4c89d06-2d3b-47f8-bc2e-fa34a9d89453" containerID="39e6ae774ae73483ab0b7bd39b3fd42ac82ac20a9b7f0bf4dc894b2a3d497325" exitCode=0 Dec 05 23:56:03 crc kubenswrapper[4734]: I1205 23:56:03.489254 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2" event={"ID":"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453","Type":"ContainerDied","Data":"39e6ae774ae73483ab0b7bd39b3fd42ac82ac20a9b7f0bf4dc894b2a3d497325"} Dec 05 23:56:04 crc kubenswrapper[4734]: I1205 23:56:04.967828 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.143112 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-inventory\") pod \"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453\" (UID: \"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453\") " Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.143191 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2wrz\" (UniqueName: \"kubernetes.io/projected/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-kube-api-access-d2wrz\") pod \"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453\" (UID: \"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453\") " Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.143498 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-ssh-key\") pod \"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453\" (UID: \"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453\") " Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.143577 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-neutron-ovn-metadata-agent-neutron-config-0\") pod \"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453\" (UID: \"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453\") " Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.143672 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-nova-metadata-neutron-config-0\") pod \"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453\" (UID: \"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453\") " Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 
23:56:05.143708 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-neutron-metadata-combined-ca-bundle\") pod \"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453\" (UID: \"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453\") " Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.150698 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "e4c89d06-2d3b-47f8-bc2e-fa34a9d89453" (UID: "e4c89d06-2d3b-47f8-bc2e-fa34a9d89453"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.153476 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-kube-api-access-d2wrz" (OuterVolumeSpecName: "kube-api-access-d2wrz") pod "e4c89d06-2d3b-47f8-bc2e-fa34a9d89453" (UID: "e4c89d06-2d3b-47f8-bc2e-fa34a9d89453"). InnerVolumeSpecName "kube-api-access-d2wrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.179916 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "e4c89d06-2d3b-47f8-bc2e-fa34a9d89453" (UID: "e4c89d06-2d3b-47f8-bc2e-fa34a9d89453"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.182565 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "e4c89d06-2d3b-47f8-bc2e-fa34a9d89453" (UID: "e4c89d06-2d3b-47f8-bc2e-fa34a9d89453"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.185938 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-inventory" (OuterVolumeSpecName: "inventory") pod "e4c89d06-2d3b-47f8-bc2e-fa34a9d89453" (UID: "e4c89d06-2d3b-47f8-bc2e-fa34a9d89453"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.194902 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e4c89d06-2d3b-47f8-bc2e-fa34a9d89453" (UID: "e4c89d06-2d3b-47f8-bc2e-fa34a9d89453"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.247057 4734 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.247098 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2wrz\" (UniqueName: \"kubernetes.io/projected/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-kube-api-access-d2wrz\") on node \"crc\" DevicePath \"\"" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.247109 4734 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.247121 4734 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.247134 4734 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.247145 4734 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c89d06-2d3b-47f8-bc2e-fa34a9d89453-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.516048 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2" 
event={"ID":"e4c89d06-2d3b-47f8-bc2e-fa34a9d89453","Type":"ContainerDied","Data":"e95dc2b8f0ec18bac9f7a0c76434d4815f5a701d8dd8b7e5231f7ca9dbe1fdae"} Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.516115 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.516130 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e95dc2b8f0ec18bac9f7a0c76434d4815f5a701d8dd8b7e5231f7ca9dbe1fdae" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.627131 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk"] Dec 05 23:56:05 crc kubenswrapper[4734]: E1205 23:56:05.627675 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f80d763b-7e9c-40a0-be4e-95ba6b628021" containerName="extract-content" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.627704 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="f80d763b-7e9c-40a0-be4e-95ba6b628021" containerName="extract-content" Dec 05 23:56:05 crc kubenswrapper[4734]: E1205 23:56:05.627728 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4c89d06-2d3b-47f8-bc2e-fa34a9d89453" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.627741 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4c89d06-2d3b-47f8-bc2e-fa34a9d89453" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 05 23:56:05 crc kubenswrapper[4734]: E1205 23:56:05.627758 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f80d763b-7e9c-40a0-be4e-95ba6b628021" containerName="extract-utilities" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.627766 4734 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f80d763b-7e9c-40a0-be4e-95ba6b628021" containerName="extract-utilities" Dec 05 23:56:05 crc kubenswrapper[4734]: E1205 23:56:05.627785 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f80d763b-7e9c-40a0-be4e-95ba6b628021" containerName="registry-server" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.627793 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="f80d763b-7e9c-40a0-be4e-95ba6b628021" containerName="registry-server" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.628045 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4c89d06-2d3b-47f8-bc2e-fa34a9d89453" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.628073 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="f80d763b-7e9c-40a0-be4e-95ba6b628021" containerName="registry-server" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.628943 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.631772 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.632833 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.633499 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.633842 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gsdqx" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.636309 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.637074 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk"] Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.757244 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f32997-f801-4f60-b010-aaff637a8292-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk\" (UID: \"85f32997-f801-4f60-b010-aaff637a8292\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.757761 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/85f32997-f801-4f60-b010-aaff637a8292-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk\" (UID: 
\"85f32997-f801-4f60-b010-aaff637a8292\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.757800 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85f32997-f801-4f60-b010-aaff637a8292-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk\" (UID: \"85f32997-f801-4f60-b010-aaff637a8292\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.758023 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85f32997-f801-4f60-b010-aaff637a8292-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk\" (UID: \"85f32997-f801-4f60-b010-aaff637a8292\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.758093 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crnzl\" (UniqueName: \"kubernetes.io/projected/85f32997-f801-4f60-b010-aaff637a8292-kube-api-access-crnzl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk\" (UID: \"85f32997-f801-4f60-b010-aaff637a8292\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.861002 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f32997-f801-4f60-b010-aaff637a8292-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk\" (UID: \"85f32997-f801-4f60-b010-aaff637a8292\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.861081 4734 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/85f32997-f801-4f60-b010-aaff637a8292-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk\" (UID: \"85f32997-f801-4f60-b010-aaff637a8292\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.861110 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85f32997-f801-4f60-b010-aaff637a8292-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk\" (UID: \"85f32997-f801-4f60-b010-aaff637a8292\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.861151 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85f32997-f801-4f60-b010-aaff637a8292-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk\" (UID: \"85f32997-f801-4f60-b010-aaff637a8292\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.861178 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crnzl\" (UniqueName: \"kubernetes.io/projected/85f32997-f801-4f60-b010-aaff637a8292-kube-api-access-crnzl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk\" (UID: \"85f32997-f801-4f60-b010-aaff637a8292\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.866968 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85f32997-f801-4f60-b010-aaff637a8292-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk\" (UID: \"85f32997-f801-4f60-b010-aaff637a8292\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.866989 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/85f32997-f801-4f60-b010-aaff637a8292-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk\" (UID: \"85f32997-f801-4f60-b010-aaff637a8292\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.867353 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85f32997-f801-4f60-b010-aaff637a8292-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk\" (UID: \"85f32997-f801-4f60-b010-aaff637a8292\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.868209 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f32997-f801-4f60-b010-aaff637a8292-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk\" (UID: \"85f32997-f801-4f60-b010-aaff637a8292\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.883208 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crnzl\" (UniqueName: \"kubernetes.io/projected/85f32997-f801-4f60-b010-aaff637a8292-kube-api-access-crnzl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk\" (UID: \"85f32997-f801-4f60-b010-aaff637a8292\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk" Dec 05 23:56:05 crc kubenswrapper[4734]: I1205 23:56:05.948956 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk" Dec 05 23:56:06 crc kubenswrapper[4734]: I1205 23:56:06.501986 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk"] Dec 05 23:56:06 crc kubenswrapper[4734]: I1205 23:56:06.528506 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk" event={"ID":"85f32997-f801-4f60-b010-aaff637a8292","Type":"ContainerStarted","Data":"880a7d5e59069778fc05d1d251fa5c08e5cf66434b8b6dcd792a95603614120f"} Dec 05 23:56:07 crc kubenswrapper[4734]: I1205 23:56:07.539569 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk" event={"ID":"85f32997-f801-4f60-b010-aaff637a8292","Type":"ContainerStarted","Data":"ff6b4e9aedfa59715254dd80adc0423be5feb648f27ecf91368e37cd83027e31"} Dec 05 23:56:46 crc kubenswrapper[4734]: I1205 23:56:46.188496 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk" podStartSLOduration=40.778253345 podStartE2EDuration="41.188461917s" podCreationTimestamp="2025-12-05 23:56:05 +0000 UTC" firstStartedPulling="2025-12-05 23:56:06.512155084 +0000 UTC m=+2187.195559360" lastFinishedPulling="2025-12-05 23:56:06.922363656 +0000 UTC m=+2187.605767932" observedRunningTime="2025-12-05 23:56:07.57013381 +0000 UTC m=+2188.253538086" watchObservedRunningTime="2025-12-05 23:56:46.188461917 +0000 UTC m=+2226.871866213" Dec 05 23:56:46 crc kubenswrapper[4734]: I1205 23:56:46.203946 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zvx7t"] Dec 05 23:56:46 crc kubenswrapper[4734]: I1205 23:56:46.206915 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zvx7t" Dec 05 23:56:46 crc kubenswrapper[4734]: I1205 23:56:46.218115 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zvx7t"] Dec 05 23:56:46 crc kubenswrapper[4734]: I1205 23:56:46.250724 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcrm4\" (UniqueName: \"kubernetes.io/projected/d2310433-f1f4-4e9b-9baf-88055ca4c11b-kube-api-access-gcrm4\") pod \"certified-operators-zvx7t\" (UID: \"d2310433-f1f4-4e9b-9baf-88055ca4c11b\") " pod="openshift-marketplace/certified-operators-zvx7t" Dec 05 23:56:46 crc kubenswrapper[4734]: I1205 23:56:46.250831 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2310433-f1f4-4e9b-9baf-88055ca4c11b-utilities\") pod \"certified-operators-zvx7t\" (UID: \"d2310433-f1f4-4e9b-9baf-88055ca4c11b\") " pod="openshift-marketplace/certified-operators-zvx7t" Dec 05 23:56:46 crc kubenswrapper[4734]: I1205 23:56:46.250868 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2310433-f1f4-4e9b-9baf-88055ca4c11b-catalog-content\") pod \"certified-operators-zvx7t\" (UID: \"d2310433-f1f4-4e9b-9baf-88055ca4c11b\") " pod="openshift-marketplace/certified-operators-zvx7t" Dec 05 23:56:46 crc kubenswrapper[4734]: I1205 23:56:46.353952 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcrm4\" (UniqueName: \"kubernetes.io/projected/d2310433-f1f4-4e9b-9baf-88055ca4c11b-kube-api-access-gcrm4\") pod \"certified-operators-zvx7t\" (UID: \"d2310433-f1f4-4e9b-9baf-88055ca4c11b\") " pod="openshift-marketplace/certified-operators-zvx7t" Dec 05 23:56:46 crc kubenswrapper[4734]: I1205 23:56:46.354077 4734 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2310433-f1f4-4e9b-9baf-88055ca4c11b-utilities\") pod \"certified-operators-zvx7t\" (UID: \"d2310433-f1f4-4e9b-9baf-88055ca4c11b\") " pod="openshift-marketplace/certified-operators-zvx7t" Dec 05 23:56:46 crc kubenswrapper[4734]: I1205 23:56:46.354116 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2310433-f1f4-4e9b-9baf-88055ca4c11b-catalog-content\") pod \"certified-operators-zvx7t\" (UID: \"d2310433-f1f4-4e9b-9baf-88055ca4c11b\") " pod="openshift-marketplace/certified-operators-zvx7t" Dec 05 23:56:46 crc kubenswrapper[4734]: I1205 23:56:46.354921 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2310433-f1f4-4e9b-9baf-88055ca4c11b-utilities\") pod \"certified-operators-zvx7t\" (UID: \"d2310433-f1f4-4e9b-9baf-88055ca4c11b\") " pod="openshift-marketplace/certified-operators-zvx7t" Dec 05 23:56:46 crc kubenswrapper[4734]: I1205 23:56:46.355007 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2310433-f1f4-4e9b-9baf-88055ca4c11b-catalog-content\") pod \"certified-operators-zvx7t\" (UID: \"d2310433-f1f4-4e9b-9baf-88055ca4c11b\") " pod="openshift-marketplace/certified-operators-zvx7t" Dec 05 23:56:46 crc kubenswrapper[4734]: I1205 23:56:46.380307 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcrm4\" (UniqueName: \"kubernetes.io/projected/d2310433-f1f4-4e9b-9baf-88055ca4c11b-kube-api-access-gcrm4\") pod \"certified-operators-zvx7t\" (UID: \"d2310433-f1f4-4e9b-9baf-88055ca4c11b\") " pod="openshift-marketplace/certified-operators-zvx7t" Dec 05 23:56:48 crc kubenswrapper[4734]: I1205 23:56:46.538396 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zvx7t" Dec 05 23:56:48 crc kubenswrapper[4734]: I1205 23:56:47.695025 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-wf4vr" podUID="aa5ccaa9-5087-4891-b255-a5135271a2a5" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.81:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 23:56:48 crc kubenswrapper[4734]: I1205 23:56:47.695398 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-wf4vr" podUID="aa5ccaa9-5087-4891-b255-a5135271a2a5" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.81:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 23:56:48 crc kubenswrapper[4734]: I1205 23:56:47.878780 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2fm4z" podUID="696f07ba-7c46-41f2-826f-890756824285" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.87:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 23:56:48 crc kubenswrapper[4734]: I1205 23:56:47.921839 4734 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2fm4z" podUID="696f07ba-7c46-41f2-826f-890756824285" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.87:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 23:56:49 crc kubenswrapper[4734]: I1205 23:56:49.245838 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zvx7t"] Dec 05 23:56:49 crc kubenswrapper[4734]: I1205 23:56:49.973599 4734 generic.go:334] 
"Generic (PLEG): container finished" podID="d2310433-f1f4-4e9b-9baf-88055ca4c11b" containerID="5b8cefaf30d18553d545c432d294e87223e607eb22626584335d14fe418d7c6f" exitCode=0 Dec 05 23:56:49 crc kubenswrapper[4734]: I1205 23:56:49.973887 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zvx7t" event={"ID":"d2310433-f1f4-4e9b-9baf-88055ca4c11b","Type":"ContainerDied","Data":"5b8cefaf30d18553d545c432d294e87223e607eb22626584335d14fe418d7c6f"} Dec 05 23:56:49 crc kubenswrapper[4734]: I1205 23:56:49.974112 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zvx7t" event={"ID":"d2310433-f1f4-4e9b-9baf-88055ca4c11b","Type":"ContainerStarted","Data":"b48a92942a7e1fa66bf39abd60998620d38483fec5c5cf94802723050df2de5c"} Dec 05 23:56:50 crc kubenswrapper[4734]: I1205 23:56:50.445381 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:56:50 crc kubenswrapper[4734]: I1205 23:56:50.445943 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:56:51 crc kubenswrapper[4734]: I1205 23:56:51.008006 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zvx7t" event={"ID":"d2310433-f1f4-4e9b-9baf-88055ca4c11b","Type":"ContainerStarted","Data":"d84bf684e67ed936ebd52d58a4508685283ac8df99ce7181604547b9eedc2c86"} Dec 05 23:56:52 crc kubenswrapper[4734]: I1205 23:56:52.023579 4734 generic.go:334] "Generic (PLEG): container 
finished" podID="d2310433-f1f4-4e9b-9baf-88055ca4c11b" containerID="d84bf684e67ed936ebd52d58a4508685283ac8df99ce7181604547b9eedc2c86" exitCode=0 Dec 05 23:56:52 crc kubenswrapper[4734]: I1205 23:56:52.023716 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zvx7t" event={"ID":"d2310433-f1f4-4e9b-9baf-88055ca4c11b","Type":"ContainerDied","Data":"d84bf684e67ed936ebd52d58a4508685283ac8df99ce7181604547b9eedc2c86"} Dec 05 23:56:52 crc kubenswrapper[4734]: I1205 23:56:52.024136 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zvx7t" event={"ID":"d2310433-f1f4-4e9b-9baf-88055ca4c11b","Type":"ContainerStarted","Data":"10edc78580c2b2d5d01401583e28a227a28fb18a0479a3dc407c38092931c16c"} Dec 05 23:56:52 crc kubenswrapper[4734]: I1205 23:56:52.053356 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zvx7t" podStartSLOduration=4.600775947 podStartE2EDuration="6.053329692s" podCreationTimestamp="2025-12-05 23:56:46 +0000 UTC" firstStartedPulling="2025-12-05 23:56:49.978580066 +0000 UTC m=+2230.661984342" lastFinishedPulling="2025-12-05 23:56:51.431133811 +0000 UTC m=+2232.114538087" observedRunningTime="2025-12-05 23:56:52.044101966 +0000 UTC m=+2232.727506262" watchObservedRunningTime="2025-12-05 23:56:52.053329692 +0000 UTC m=+2232.736733968" Dec 05 23:56:56 crc kubenswrapper[4734]: I1205 23:56:56.538857 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zvx7t" Dec 05 23:56:56 crc kubenswrapper[4734]: I1205 23:56:56.539295 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zvx7t" Dec 05 23:56:56 crc kubenswrapper[4734]: I1205 23:56:56.591590 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zvx7t" Dec 
05 23:56:57 crc kubenswrapper[4734]: I1205 23:56:57.127002 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zvx7t" Dec 05 23:56:57 crc kubenswrapper[4734]: I1205 23:56:57.183608 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zvx7t"] Dec 05 23:56:59 crc kubenswrapper[4734]: I1205 23:56:59.096224 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zvx7t" podUID="d2310433-f1f4-4e9b-9baf-88055ca4c11b" containerName="registry-server" containerID="cri-o://10edc78580c2b2d5d01401583e28a227a28fb18a0479a3dc407c38092931c16c" gracePeriod=2 Dec 05 23:56:59 crc kubenswrapper[4734]: E1205 23:56:59.289607 4734 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2310433_f1f4_4e9b_9baf_88055ca4c11b.slice/crio-10edc78580c2b2d5d01401583e28a227a28fb18a0479a3dc407c38092931c16c.scope\": RecentStats: unable to find data in memory cache]" Dec 05 23:57:00 crc kubenswrapper[4734]: I1205 23:57:00.106573 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zvx7t" Dec 05 23:57:00 crc kubenswrapper[4734]: I1205 23:57:00.109895 4734 generic.go:334] "Generic (PLEG): container finished" podID="d2310433-f1f4-4e9b-9baf-88055ca4c11b" containerID="10edc78580c2b2d5d01401583e28a227a28fb18a0479a3dc407c38092931c16c" exitCode=0 Dec 05 23:57:00 crc kubenswrapper[4734]: I1205 23:57:00.109959 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zvx7t" event={"ID":"d2310433-f1f4-4e9b-9baf-88055ca4c11b","Type":"ContainerDied","Data":"10edc78580c2b2d5d01401583e28a227a28fb18a0479a3dc407c38092931c16c"} Dec 05 23:57:00 crc kubenswrapper[4734]: I1205 23:57:00.110023 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zvx7t" event={"ID":"d2310433-f1f4-4e9b-9baf-88055ca4c11b","Type":"ContainerDied","Data":"b48a92942a7e1fa66bf39abd60998620d38483fec5c5cf94802723050df2de5c"} Dec 05 23:57:00 crc kubenswrapper[4734]: I1205 23:57:00.110052 4734 scope.go:117] "RemoveContainer" containerID="10edc78580c2b2d5d01401583e28a227a28fb18a0479a3dc407c38092931c16c" Dec 05 23:57:00 crc kubenswrapper[4734]: I1205 23:57:00.142399 4734 scope.go:117] "RemoveContainer" containerID="d84bf684e67ed936ebd52d58a4508685283ac8df99ce7181604547b9eedc2c86" Dec 05 23:57:00 crc kubenswrapper[4734]: I1205 23:57:00.176841 4734 scope.go:117] "RemoveContainer" containerID="5b8cefaf30d18553d545c432d294e87223e607eb22626584335d14fe418d7c6f" Dec 05 23:57:00 crc kubenswrapper[4734]: I1205 23:57:00.229967 4734 scope.go:117] "RemoveContainer" containerID="10edc78580c2b2d5d01401583e28a227a28fb18a0479a3dc407c38092931c16c" Dec 05 23:57:00 crc kubenswrapper[4734]: E1205 23:57:00.230697 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10edc78580c2b2d5d01401583e28a227a28fb18a0479a3dc407c38092931c16c\": container with ID starting with 
10edc78580c2b2d5d01401583e28a227a28fb18a0479a3dc407c38092931c16c not found: ID does not exist" containerID="10edc78580c2b2d5d01401583e28a227a28fb18a0479a3dc407c38092931c16c" Dec 05 23:57:00 crc kubenswrapper[4734]: I1205 23:57:00.230834 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10edc78580c2b2d5d01401583e28a227a28fb18a0479a3dc407c38092931c16c"} err="failed to get container status \"10edc78580c2b2d5d01401583e28a227a28fb18a0479a3dc407c38092931c16c\": rpc error: code = NotFound desc = could not find container \"10edc78580c2b2d5d01401583e28a227a28fb18a0479a3dc407c38092931c16c\": container with ID starting with 10edc78580c2b2d5d01401583e28a227a28fb18a0479a3dc407c38092931c16c not found: ID does not exist" Dec 05 23:57:00 crc kubenswrapper[4734]: I1205 23:57:00.230959 4734 scope.go:117] "RemoveContainer" containerID="d84bf684e67ed936ebd52d58a4508685283ac8df99ce7181604547b9eedc2c86" Dec 05 23:57:00 crc kubenswrapper[4734]: E1205 23:57:00.231797 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d84bf684e67ed936ebd52d58a4508685283ac8df99ce7181604547b9eedc2c86\": container with ID starting with d84bf684e67ed936ebd52d58a4508685283ac8df99ce7181604547b9eedc2c86 not found: ID does not exist" containerID="d84bf684e67ed936ebd52d58a4508685283ac8df99ce7181604547b9eedc2c86" Dec 05 23:57:00 crc kubenswrapper[4734]: I1205 23:57:00.231840 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d84bf684e67ed936ebd52d58a4508685283ac8df99ce7181604547b9eedc2c86"} err="failed to get container status \"d84bf684e67ed936ebd52d58a4508685283ac8df99ce7181604547b9eedc2c86\": rpc error: code = NotFound desc = could not find container \"d84bf684e67ed936ebd52d58a4508685283ac8df99ce7181604547b9eedc2c86\": container with ID starting with d84bf684e67ed936ebd52d58a4508685283ac8df99ce7181604547b9eedc2c86 not found: ID does not 
exist" Dec 05 23:57:00 crc kubenswrapper[4734]: I1205 23:57:00.231876 4734 scope.go:117] "RemoveContainer" containerID="5b8cefaf30d18553d545c432d294e87223e607eb22626584335d14fe418d7c6f" Dec 05 23:57:00 crc kubenswrapper[4734]: E1205 23:57:00.232246 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b8cefaf30d18553d545c432d294e87223e607eb22626584335d14fe418d7c6f\": container with ID starting with 5b8cefaf30d18553d545c432d294e87223e607eb22626584335d14fe418d7c6f not found: ID does not exist" containerID="5b8cefaf30d18553d545c432d294e87223e607eb22626584335d14fe418d7c6f" Dec 05 23:57:00 crc kubenswrapper[4734]: I1205 23:57:00.232268 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b8cefaf30d18553d545c432d294e87223e607eb22626584335d14fe418d7c6f"} err="failed to get container status \"5b8cefaf30d18553d545c432d294e87223e607eb22626584335d14fe418d7c6f\": rpc error: code = NotFound desc = could not find container \"5b8cefaf30d18553d545c432d294e87223e607eb22626584335d14fe418d7c6f\": container with ID starting with 5b8cefaf30d18553d545c432d294e87223e607eb22626584335d14fe418d7c6f not found: ID does not exist" Dec 05 23:57:00 crc kubenswrapper[4734]: I1205 23:57:00.281409 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2310433-f1f4-4e9b-9baf-88055ca4c11b-utilities\") pod \"d2310433-f1f4-4e9b-9baf-88055ca4c11b\" (UID: \"d2310433-f1f4-4e9b-9baf-88055ca4c11b\") " Dec 05 23:57:00 crc kubenswrapper[4734]: I1205 23:57:00.281582 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2310433-f1f4-4e9b-9baf-88055ca4c11b-catalog-content\") pod \"d2310433-f1f4-4e9b-9baf-88055ca4c11b\" (UID: \"d2310433-f1f4-4e9b-9baf-88055ca4c11b\") " Dec 05 23:57:00 crc kubenswrapper[4734]: I1205 
23:57:00.281629 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcrm4\" (UniqueName: \"kubernetes.io/projected/d2310433-f1f4-4e9b-9baf-88055ca4c11b-kube-api-access-gcrm4\") pod \"d2310433-f1f4-4e9b-9baf-88055ca4c11b\" (UID: \"d2310433-f1f4-4e9b-9baf-88055ca4c11b\") " Dec 05 23:57:00 crc kubenswrapper[4734]: I1205 23:57:00.283036 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2310433-f1f4-4e9b-9baf-88055ca4c11b-utilities" (OuterVolumeSpecName: "utilities") pod "d2310433-f1f4-4e9b-9baf-88055ca4c11b" (UID: "d2310433-f1f4-4e9b-9baf-88055ca4c11b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:57:00 crc kubenswrapper[4734]: I1205 23:57:00.291019 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2310433-f1f4-4e9b-9baf-88055ca4c11b-kube-api-access-gcrm4" (OuterVolumeSpecName: "kube-api-access-gcrm4") pod "d2310433-f1f4-4e9b-9baf-88055ca4c11b" (UID: "d2310433-f1f4-4e9b-9baf-88055ca4c11b"). InnerVolumeSpecName "kube-api-access-gcrm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:57:00 crc kubenswrapper[4734]: I1205 23:57:00.342130 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2310433-f1f4-4e9b-9baf-88055ca4c11b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d2310433-f1f4-4e9b-9baf-88055ca4c11b" (UID: "d2310433-f1f4-4e9b-9baf-88055ca4c11b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:57:00 crc kubenswrapper[4734]: I1205 23:57:00.384945 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2310433-f1f4-4e9b-9baf-88055ca4c11b-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:57:00 crc kubenswrapper[4734]: I1205 23:57:00.384999 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2310433-f1f4-4e9b-9baf-88055ca4c11b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:57:00 crc kubenswrapper[4734]: I1205 23:57:00.385016 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcrm4\" (UniqueName: \"kubernetes.io/projected/d2310433-f1f4-4e9b-9baf-88055ca4c11b-kube-api-access-gcrm4\") on node \"crc\" DevicePath \"\"" Dec 05 23:57:01 crc kubenswrapper[4734]: I1205 23:57:01.124670 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zvx7t" Dec 05 23:57:01 crc kubenswrapper[4734]: I1205 23:57:01.170853 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zvx7t"] Dec 05 23:57:01 crc kubenswrapper[4734]: I1205 23:57:01.181207 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zvx7t"] Dec 05 23:57:01 crc kubenswrapper[4734]: I1205 23:57:01.627318 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2310433-f1f4-4e9b-9baf-88055ca4c11b" path="/var/lib/kubelet/pods/d2310433-f1f4-4e9b-9baf-88055ca4c11b/volumes" Dec 05 23:57:20 crc kubenswrapper[4734]: I1205 23:57:20.444555 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 05 23:57:20 crc kubenswrapper[4734]: I1205 23:57:20.445386 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:57:50 crc kubenswrapper[4734]: I1205 23:57:50.444286 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 23:57:50 crc kubenswrapper[4734]: I1205 23:57:50.445073 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 23:57:50 crc kubenswrapper[4734]: I1205 23:57:50.445133 4734 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" Dec 05 23:57:50 crc kubenswrapper[4734]: I1205 23:57:50.446181 4734 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c1125a47316243dbc8e4b9f56d99d1db26d491a48005ab2e218e005031c75762"} pod="openshift-machine-config-operator/machine-config-daemon-vn94d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 23:57:50 crc kubenswrapper[4734]: I1205 23:57:50.446242 4734 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" containerID="cri-o://c1125a47316243dbc8e4b9f56d99d1db26d491a48005ab2e218e005031c75762" gracePeriod=600 Dec 05 23:57:50 crc kubenswrapper[4734]: E1205 23:57:50.580952 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:57:50 crc kubenswrapper[4734]: I1205 23:57:50.667757 4734 generic.go:334] "Generic (PLEG): container finished" podID="65758270-a7a7-46b5-af95-0588daf9fa86" containerID="c1125a47316243dbc8e4b9f56d99d1db26d491a48005ab2e218e005031c75762" exitCode=0 Dec 05 23:57:50 crc kubenswrapper[4734]: I1205 23:57:50.667797 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerDied","Data":"c1125a47316243dbc8e4b9f56d99d1db26d491a48005ab2e218e005031c75762"} Dec 05 23:57:50 crc kubenswrapper[4734]: I1205 23:57:50.667880 4734 scope.go:117] "RemoveContainer" containerID="62ea0ca90a403cb22c1108463dd79f495ead53f2907ab42b91e5688249314f62" Dec 05 23:57:50 crc kubenswrapper[4734]: I1205 23:57:50.669308 4734 scope.go:117] "RemoveContainer" containerID="c1125a47316243dbc8e4b9f56d99d1db26d491a48005ab2e218e005031c75762" Dec 05 23:57:50 crc kubenswrapper[4734]: E1205 23:57:50.669712 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:57:50 crc kubenswrapper[4734]: E1205 23:57:50.711391 4734 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65758270_a7a7_46b5_af95_0588daf9fa86.slice/crio-conmon-c1125a47316243dbc8e4b9f56d99d1db26d491a48005ab2e218e005031c75762.scope\": RecentStats: unable to find data in memory cache]" Dec 05 23:58:05 crc kubenswrapper[4734]: I1205 23:58:05.614372 4734 scope.go:117] "RemoveContainer" containerID="c1125a47316243dbc8e4b9f56d99d1db26d491a48005ab2e218e005031c75762" Dec 05 23:58:05 crc kubenswrapper[4734]: E1205 23:58:05.615264 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:58:18 crc kubenswrapper[4734]: I1205 23:58:18.615061 4734 scope.go:117] "RemoveContainer" containerID="c1125a47316243dbc8e4b9f56d99d1db26d491a48005ab2e218e005031c75762" Dec 05 23:58:18 crc kubenswrapper[4734]: E1205 23:58:18.616135 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 
23:58:29 crc kubenswrapper[4734]: I1205 23:58:29.622363 4734 scope.go:117] "RemoveContainer" containerID="c1125a47316243dbc8e4b9f56d99d1db26d491a48005ab2e218e005031c75762" Dec 05 23:58:29 crc kubenswrapper[4734]: E1205 23:58:29.623591 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:58:43 crc kubenswrapper[4734]: I1205 23:58:43.614350 4734 scope.go:117] "RemoveContainer" containerID="c1125a47316243dbc8e4b9f56d99d1db26d491a48005ab2e218e005031c75762" Dec 05 23:58:43 crc kubenswrapper[4734]: E1205 23:58:43.617151 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:58:54 crc kubenswrapper[4734]: I1205 23:58:54.615154 4734 scope.go:117] "RemoveContainer" containerID="c1125a47316243dbc8e4b9f56d99d1db26d491a48005ab2e218e005031c75762" Dec 05 23:58:54 crc kubenswrapper[4734]: E1205 23:58:54.616470 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" 
podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:59:09 crc kubenswrapper[4734]: I1205 23:59:09.623018 4734 scope.go:117] "RemoveContainer" containerID="c1125a47316243dbc8e4b9f56d99d1db26d491a48005ab2e218e005031c75762" Dec 05 23:59:09 crc kubenswrapper[4734]: E1205 23:59:09.624168 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:59:21 crc kubenswrapper[4734]: I1205 23:59:21.424861 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5x4m6"] Dec 05 23:59:21 crc kubenswrapper[4734]: E1205 23:59:21.426172 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2310433-f1f4-4e9b-9baf-88055ca4c11b" containerName="extract-utilities" Dec 05 23:59:21 crc kubenswrapper[4734]: I1205 23:59:21.426189 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2310433-f1f4-4e9b-9baf-88055ca4c11b" containerName="extract-utilities" Dec 05 23:59:21 crc kubenswrapper[4734]: E1205 23:59:21.426219 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2310433-f1f4-4e9b-9baf-88055ca4c11b" containerName="registry-server" Dec 05 23:59:21 crc kubenswrapper[4734]: I1205 23:59:21.426227 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2310433-f1f4-4e9b-9baf-88055ca4c11b" containerName="registry-server" Dec 05 23:59:21 crc kubenswrapper[4734]: E1205 23:59:21.426248 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2310433-f1f4-4e9b-9baf-88055ca4c11b" containerName="extract-content" Dec 05 23:59:21 crc kubenswrapper[4734]: I1205 23:59:21.426254 4734 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="d2310433-f1f4-4e9b-9baf-88055ca4c11b" containerName="extract-content" Dec 05 23:59:21 crc kubenswrapper[4734]: I1205 23:59:21.426591 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2310433-f1f4-4e9b-9baf-88055ca4c11b" containerName="registry-server" Dec 05 23:59:21 crc kubenswrapper[4734]: I1205 23:59:21.428263 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5x4m6" Dec 05 23:59:21 crc kubenswrapper[4734]: I1205 23:59:21.436083 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5x4m6"] Dec 05 23:59:21 crc kubenswrapper[4734]: I1205 23:59:21.545106 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6030d9f5-98c6-40a2-ad78-5f9463e64cc7-utilities\") pod \"community-operators-5x4m6\" (UID: \"6030d9f5-98c6-40a2-ad78-5f9463e64cc7\") " pod="openshift-marketplace/community-operators-5x4m6" Dec 05 23:59:21 crc kubenswrapper[4734]: I1205 23:59:21.545490 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6030d9f5-98c6-40a2-ad78-5f9463e64cc7-catalog-content\") pod \"community-operators-5x4m6\" (UID: \"6030d9f5-98c6-40a2-ad78-5f9463e64cc7\") " pod="openshift-marketplace/community-operators-5x4m6" Dec 05 23:59:21 crc kubenswrapper[4734]: I1205 23:59:21.545645 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgzth\" (UniqueName: \"kubernetes.io/projected/6030d9f5-98c6-40a2-ad78-5f9463e64cc7-kube-api-access-cgzth\") pod \"community-operators-5x4m6\" (UID: \"6030d9f5-98c6-40a2-ad78-5f9463e64cc7\") " pod="openshift-marketplace/community-operators-5x4m6" Dec 05 23:59:21 crc kubenswrapper[4734]: I1205 23:59:21.647484 4734 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6030d9f5-98c6-40a2-ad78-5f9463e64cc7-utilities\") pod \"community-operators-5x4m6\" (UID: \"6030d9f5-98c6-40a2-ad78-5f9463e64cc7\") " pod="openshift-marketplace/community-operators-5x4m6" Dec 05 23:59:21 crc kubenswrapper[4734]: I1205 23:59:21.647560 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6030d9f5-98c6-40a2-ad78-5f9463e64cc7-catalog-content\") pod \"community-operators-5x4m6\" (UID: \"6030d9f5-98c6-40a2-ad78-5f9463e64cc7\") " pod="openshift-marketplace/community-operators-5x4m6" Dec 05 23:59:21 crc kubenswrapper[4734]: I1205 23:59:21.647617 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgzth\" (UniqueName: \"kubernetes.io/projected/6030d9f5-98c6-40a2-ad78-5f9463e64cc7-kube-api-access-cgzth\") pod \"community-operators-5x4m6\" (UID: \"6030d9f5-98c6-40a2-ad78-5f9463e64cc7\") " pod="openshift-marketplace/community-operators-5x4m6" Dec 05 23:59:21 crc kubenswrapper[4734]: I1205 23:59:21.648976 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6030d9f5-98c6-40a2-ad78-5f9463e64cc7-utilities\") pod \"community-operators-5x4m6\" (UID: \"6030d9f5-98c6-40a2-ad78-5f9463e64cc7\") " pod="openshift-marketplace/community-operators-5x4m6" Dec 05 23:59:21 crc kubenswrapper[4734]: I1205 23:59:21.650505 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6030d9f5-98c6-40a2-ad78-5f9463e64cc7-catalog-content\") pod \"community-operators-5x4m6\" (UID: \"6030d9f5-98c6-40a2-ad78-5f9463e64cc7\") " pod="openshift-marketplace/community-operators-5x4m6" Dec 05 23:59:21 crc kubenswrapper[4734]: I1205 23:59:21.672811 4734 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cgzth\" (UniqueName: \"kubernetes.io/projected/6030d9f5-98c6-40a2-ad78-5f9463e64cc7-kube-api-access-cgzth\") pod \"community-operators-5x4m6\" (UID: \"6030d9f5-98c6-40a2-ad78-5f9463e64cc7\") " pod="openshift-marketplace/community-operators-5x4m6" Dec 05 23:59:21 crc kubenswrapper[4734]: I1205 23:59:21.755382 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5x4m6" Dec 05 23:59:22 crc kubenswrapper[4734]: I1205 23:59:22.346601 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5x4m6"] Dec 05 23:59:22 crc kubenswrapper[4734]: I1205 23:59:22.614607 4734 scope.go:117] "RemoveContainer" containerID="c1125a47316243dbc8e4b9f56d99d1db26d491a48005ab2e218e005031c75762" Dec 05 23:59:22 crc kubenswrapper[4734]: E1205 23:59:22.614864 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:59:22 crc kubenswrapper[4734]: I1205 23:59:22.715602 4734 generic.go:334] "Generic (PLEG): container finished" podID="6030d9f5-98c6-40a2-ad78-5f9463e64cc7" containerID="bfafffcba50336503f4baf1b0404039533ad0c6e2f40a82f6ca7f2c90f56ea94" exitCode=0 Dec 05 23:59:22 crc kubenswrapper[4734]: I1205 23:59:22.715682 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5x4m6" event={"ID":"6030d9f5-98c6-40a2-ad78-5f9463e64cc7","Type":"ContainerDied","Data":"bfafffcba50336503f4baf1b0404039533ad0c6e2f40a82f6ca7f2c90f56ea94"} Dec 05 23:59:22 crc kubenswrapper[4734]: I1205 23:59:22.715728 4734 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5x4m6" event={"ID":"6030d9f5-98c6-40a2-ad78-5f9463e64cc7","Type":"ContainerStarted","Data":"51cf90abdf39e638f42ec3350ab11f5f0e017214d6c1953afdd99a92b10b73c9"} Dec 05 23:59:22 crc kubenswrapper[4734]: I1205 23:59:22.719906 4734 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 23:59:23 crc kubenswrapper[4734]: I1205 23:59:23.728941 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5x4m6" event={"ID":"6030d9f5-98c6-40a2-ad78-5f9463e64cc7","Type":"ContainerStarted","Data":"f7895920be14231d98503a12bd47d0459f5b74c2608332fa2f12cac0f43f70a0"} Dec 05 23:59:24 crc kubenswrapper[4734]: I1205 23:59:24.740919 4734 generic.go:334] "Generic (PLEG): container finished" podID="6030d9f5-98c6-40a2-ad78-5f9463e64cc7" containerID="f7895920be14231d98503a12bd47d0459f5b74c2608332fa2f12cac0f43f70a0" exitCode=0 Dec 05 23:59:24 crc kubenswrapper[4734]: I1205 23:59:24.741115 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5x4m6" event={"ID":"6030d9f5-98c6-40a2-ad78-5f9463e64cc7","Type":"ContainerDied","Data":"f7895920be14231d98503a12bd47d0459f5b74c2608332fa2f12cac0f43f70a0"} Dec 05 23:59:25 crc kubenswrapper[4734]: I1205 23:59:25.755360 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5x4m6" event={"ID":"6030d9f5-98c6-40a2-ad78-5f9463e64cc7","Type":"ContainerStarted","Data":"e335180af6b28c98fd95f5f5ff65dde5fe112ddccded46deeb3bec681a7d307b"} Dec 05 23:59:25 crc kubenswrapper[4734]: I1205 23:59:25.779845 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5x4m6" podStartSLOduration=2.30729707 podStartE2EDuration="4.779822994s" podCreationTimestamp="2025-12-05 23:59:21 +0000 UTC" firstStartedPulling="2025-12-05 
23:59:22.719454827 +0000 UTC m=+2383.402859103" lastFinishedPulling="2025-12-05 23:59:25.191980751 +0000 UTC m=+2385.875385027" observedRunningTime="2025-12-05 23:59:25.778566413 +0000 UTC m=+2386.461970689" watchObservedRunningTime="2025-12-05 23:59:25.779822994 +0000 UTC m=+2386.463227260" Dec 05 23:59:31 crc kubenswrapper[4734]: I1205 23:59:31.755889 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5x4m6" Dec 05 23:59:31 crc kubenswrapper[4734]: I1205 23:59:31.756811 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5x4m6" Dec 05 23:59:31 crc kubenswrapper[4734]: I1205 23:59:31.811244 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5x4m6" Dec 05 23:59:31 crc kubenswrapper[4734]: I1205 23:59:31.881677 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5x4m6" Dec 05 23:59:32 crc kubenswrapper[4734]: I1205 23:59:32.807172 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5x4m6"] Dec 05 23:59:33 crc kubenswrapper[4734]: I1205 23:59:33.842370 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5x4m6" podUID="6030d9f5-98c6-40a2-ad78-5f9463e64cc7" containerName="registry-server" containerID="cri-o://e335180af6b28c98fd95f5f5ff65dde5fe112ddccded46deeb3bec681a7d307b" gracePeriod=2 Dec 05 23:59:34 crc kubenswrapper[4734]: I1205 23:59:34.410199 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5x4m6" Dec 05 23:59:34 crc kubenswrapper[4734]: I1205 23:59:34.579170 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6030d9f5-98c6-40a2-ad78-5f9463e64cc7-catalog-content\") pod \"6030d9f5-98c6-40a2-ad78-5f9463e64cc7\" (UID: \"6030d9f5-98c6-40a2-ad78-5f9463e64cc7\") " Dec 05 23:59:34 crc kubenswrapper[4734]: I1205 23:59:34.579254 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgzth\" (UniqueName: \"kubernetes.io/projected/6030d9f5-98c6-40a2-ad78-5f9463e64cc7-kube-api-access-cgzth\") pod \"6030d9f5-98c6-40a2-ad78-5f9463e64cc7\" (UID: \"6030d9f5-98c6-40a2-ad78-5f9463e64cc7\") " Dec 05 23:59:34 crc kubenswrapper[4734]: I1205 23:59:34.579388 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6030d9f5-98c6-40a2-ad78-5f9463e64cc7-utilities\") pod \"6030d9f5-98c6-40a2-ad78-5f9463e64cc7\" (UID: \"6030d9f5-98c6-40a2-ad78-5f9463e64cc7\") " Dec 05 23:59:34 crc kubenswrapper[4734]: I1205 23:59:34.580705 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6030d9f5-98c6-40a2-ad78-5f9463e64cc7-utilities" (OuterVolumeSpecName: "utilities") pod "6030d9f5-98c6-40a2-ad78-5f9463e64cc7" (UID: "6030d9f5-98c6-40a2-ad78-5f9463e64cc7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:59:34 crc kubenswrapper[4734]: I1205 23:59:34.587069 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6030d9f5-98c6-40a2-ad78-5f9463e64cc7-kube-api-access-cgzth" (OuterVolumeSpecName: "kube-api-access-cgzth") pod "6030d9f5-98c6-40a2-ad78-5f9463e64cc7" (UID: "6030d9f5-98c6-40a2-ad78-5f9463e64cc7"). InnerVolumeSpecName "kube-api-access-cgzth". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 23:59:34 crc kubenswrapper[4734]: I1205 23:59:34.640347 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6030d9f5-98c6-40a2-ad78-5f9463e64cc7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6030d9f5-98c6-40a2-ad78-5f9463e64cc7" (UID: "6030d9f5-98c6-40a2-ad78-5f9463e64cc7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 23:59:34 crc kubenswrapper[4734]: I1205 23:59:34.682872 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6030d9f5-98c6-40a2-ad78-5f9463e64cc7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 23:59:34 crc kubenswrapper[4734]: I1205 23:59:34.682915 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgzth\" (UniqueName: \"kubernetes.io/projected/6030d9f5-98c6-40a2-ad78-5f9463e64cc7-kube-api-access-cgzth\") on node \"crc\" DevicePath \"\"" Dec 05 23:59:34 crc kubenswrapper[4734]: I1205 23:59:34.682925 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6030d9f5-98c6-40a2-ad78-5f9463e64cc7-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 23:59:34 crc kubenswrapper[4734]: I1205 23:59:34.854954 4734 generic.go:334] "Generic (PLEG): container finished" podID="6030d9f5-98c6-40a2-ad78-5f9463e64cc7" containerID="e335180af6b28c98fd95f5f5ff65dde5fe112ddccded46deeb3bec681a7d307b" exitCode=0 Dec 05 23:59:34 crc kubenswrapper[4734]: I1205 23:59:34.855009 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5x4m6" event={"ID":"6030d9f5-98c6-40a2-ad78-5f9463e64cc7","Type":"ContainerDied","Data":"e335180af6b28c98fd95f5f5ff65dde5fe112ddccded46deeb3bec681a7d307b"} Dec 05 23:59:34 crc kubenswrapper[4734]: I1205 23:59:34.855033 4734 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-5x4m6" Dec 05 23:59:34 crc kubenswrapper[4734]: I1205 23:59:34.855054 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5x4m6" event={"ID":"6030d9f5-98c6-40a2-ad78-5f9463e64cc7","Type":"ContainerDied","Data":"51cf90abdf39e638f42ec3350ab11f5f0e017214d6c1953afdd99a92b10b73c9"} Dec 05 23:59:34 crc kubenswrapper[4734]: I1205 23:59:34.855075 4734 scope.go:117] "RemoveContainer" containerID="e335180af6b28c98fd95f5f5ff65dde5fe112ddccded46deeb3bec681a7d307b" Dec 05 23:59:34 crc kubenswrapper[4734]: I1205 23:59:34.888698 4734 scope.go:117] "RemoveContainer" containerID="f7895920be14231d98503a12bd47d0459f5b74c2608332fa2f12cac0f43f70a0" Dec 05 23:59:34 crc kubenswrapper[4734]: I1205 23:59:34.904370 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5x4m6"] Dec 05 23:59:34 crc kubenswrapper[4734]: I1205 23:59:34.912817 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5x4m6"] Dec 05 23:59:34 crc kubenswrapper[4734]: I1205 23:59:34.941141 4734 scope.go:117] "RemoveContainer" containerID="bfafffcba50336503f4baf1b0404039533ad0c6e2f40a82f6ca7f2c90f56ea94" Dec 05 23:59:34 crc kubenswrapper[4734]: I1205 23:59:34.977391 4734 scope.go:117] "RemoveContainer" containerID="e335180af6b28c98fd95f5f5ff65dde5fe112ddccded46deeb3bec681a7d307b" Dec 05 23:59:34 crc kubenswrapper[4734]: E1205 23:59:34.978750 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e335180af6b28c98fd95f5f5ff65dde5fe112ddccded46deeb3bec681a7d307b\": container with ID starting with e335180af6b28c98fd95f5f5ff65dde5fe112ddccded46deeb3bec681a7d307b not found: ID does not exist" containerID="e335180af6b28c98fd95f5f5ff65dde5fe112ddccded46deeb3bec681a7d307b" Dec 05 23:59:34 crc kubenswrapper[4734]: I1205 23:59:34.978803 
4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e335180af6b28c98fd95f5f5ff65dde5fe112ddccded46deeb3bec681a7d307b"} err="failed to get container status \"e335180af6b28c98fd95f5f5ff65dde5fe112ddccded46deeb3bec681a7d307b\": rpc error: code = NotFound desc = could not find container \"e335180af6b28c98fd95f5f5ff65dde5fe112ddccded46deeb3bec681a7d307b\": container with ID starting with e335180af6b28c98fd95f5f5ff65dde5fe112ddccded46deeb3bec681a7d307b not found: ID does not exist" Dec 05 23:59:34 crc kubenswrapper[4734]: I1205 23:59:34.978840 4734 scope.go:117] "RemoveContainer" containerID="f7895920be14231d98503a12bd47d0459f5b74c2608332fa2f12cac0f43f70a0" Dec 05 23:59:34 crc kubenswrapper[4734]: E1205 23:59:34.980265 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7895920be14231d98503a12bd47d0459f5b74c2608332fa2f12cac0f43f70a0\": container with ID starting with f7895920be14231d98503a12bd47d0459f5b74c2608332fa2f12cac0f43f70a0 not found: ID does not exist" containerID="f7895920be14231d98503a12bd47d0459f5b74c2608332fa2f12cac0f43f70a0" Dec 05 23:59:34 crc kubenswrapper[4734]: I1205 23:59:34.980292 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7895920be14231d98503a12bd47d0459f5b74c2608332fa2f12cac0f43f70a0"} err="failed to get container status \"f7895920be14231d98503a12bd47d0459f5b74c2608332fa2f12cac0f43f70a0\": rpc error: code = NotFound desc = could not find container \"f7895920be14231d98503a12bd47d0459f5b74c2608332fa2f12cac0f43f70a0\": container with ID starting with f7895920be14231d98503a12bd47d0459f5b74c2608332fa2f12cac0f43f70a0 not found: ID does not exist" Dec 05 23:59:34 crc kubenswrapper[4734]: I1205 23:59:34.980308 4734 scope.go:117] "RemoveContainer" containerID="bfafffcba50336503f4baf1b0404039533ad0c6e2f40a82f6ca7f2c90f56ea94" Dec 05 23:59:34 crc kubenswrapper[4734]: E1205 
23:59:34.980847 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfafffcba50336503f4baf1b0404039533ad0c6e2f40a82f6ca7f2c90f56ea94\": container with ID starting with bfafffcba50336503f4baf1b0404039533ad0c6e2f40a82f6ca7f2c90f56ea94 not found: ID does not exist" containerID="bfafffcba50336503f4baf1b0404039533ad0c6e2f40a82f6ca7f2c90f56ea94" Dec 05 23:59:34 crc kubenswrapper[4734]: I1205 23:59:34.980915 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfafffcba50336503f4baf1b0404039533ad0c6e2f40a82f6ca7f2c90f56ea94"} err="failed to get container status \"bfafffcba50336503f4baf1b0404039533ad0c6e2f40a82f6ca7f2c90f56ea94\": rpc error: code = NotFound desc = could not find container \"bfafffcba50336503f4baf1b0404039533ad0c6e2f40a82f6ca7f2c90f56ea94\": container with ID starting with bfafffcba50336503f4baf1b0404039533ad0c6e2f40a82f6ca7f2c90f56ea94 not found: ID does not exist" Dec 05 23:59:35 crc kubenswrapper[4734]: I1205 23:59:35.614974 4734 scope.go:117] "RemoveContainer" containerID="c1125a47316243dbc8e4b9f56d99d1db26d491a48005ab2e218e005031c75762" Dec 05 23:59:35 crc kubenswrapper[4734]: E1205 23:59:35.616196 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 05 23:59:35 crc kubenswrapper[4734]: I1205 23:59:35.627353 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6030d9f5-98c6-40a2-ad78-5f9463e64cc7" path="/var/lib/kubelet/pods/6030d9f5-98c6-40a2-ad78-5f9463e64cc7/volumes" Dec 05 23:59:48 crc kubenswrapper[4734]: I1205 23:59:48.614740 
4734 scope.go:117] "RemoveContainer" containerID="c1125a47316243dbc8e4b9f56d99d1db26d491a48005ab2e218e005031c75762" Dec 05 23:59:48 crc kubenswrapper[4734]: E1205 23:59:48.615926 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.177148 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416320-9knj9"] Dec 06 00:00:00 crc kubenswrapper[4734]: E1206 00:00:00.178663 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6030d9f5-98c6-40a2-ad78-5f9463e64cc7" containerName="extract-utilities" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.178687 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="6030d9f5-98c6-40a2-ad78-5f9463e64cc7" containerName="extract-utilities" Dec 06 00:00:00 crc kubenswrapper[4734]: E1206 00:00:00.178728 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6030d9f5-98c6-40a2-ad78-5f9463e64cc7" containerName="registry-server" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.178735 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="6030d9f5-98c6-40a2-ad78-5f9463e64cc7" containerName="registry-server" Dec 06 00:00:00 crc kubenswrapper[4734]: E1206 00:00:00.178755 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6030d9f5-98c6-40a2-ad78-5f9463e64cc7" containerName="extract-content" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.178764 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="6030d9f5-98c6-40a2-ad78-5f9463e64cc7" containerName="extract-content" Dec 
06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.178983 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="6030d9f5-98c6-40a2-ad78-5f9463e64cc7" containerName="registry-server" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.180018 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416320-9knj9" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.185228 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.185610 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.192377 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29416320-lkmf8"] Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.194740 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29416320-lkmf8" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.205357 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.205847 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.215594 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-purge-29416320-kjz69"] Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.225070 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-purge-29416320-kjz69" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.227774 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.262345 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-purge-29416320-x2nsf"] Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.269208 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-purge-29416320-x2nsf" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.275136 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.275159 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-purge-29416320-x2nsf"] Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.296642 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqxh5\" (UniqueName: \"kubernetes.io/projected/af05c4c3-513e-45b8-8eda-3eecdbb6561a-kube-api-access-hqxh5\") pod \"image-pruner-29416320-lkmf8\" (UID: \"af05c4c3-513e-45b8-8eda-3eecdbb6561a\") " pod="openshift-image-registry/image-pruner-29416320-lkmf8" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.296738 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd-config-volume\") pod \"collect-profiles-29416320-9knj9\" (UID: \"1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416320-9knj9" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.296829 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-tj49l\" (UniqueName: \"kubernetes.io/projected/1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd-kube-api-access-tj49l\") pod \"collect-profiles-29416320-9knj9\" (UID: \"1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416320-9knj9" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.297102 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/af05c4c3-513e-45b8-8eda-3eecdbb6561a-serviceca\") pod \"image-pruner-29416320-lkmf8\" (UID: \"af05c4c3-513e-45b8-8eda-3eecdbb6561a\") " pod="openshift-image-registry/image-pruner-29416320-lkmf8" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.297216 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd-secret-volume\") pod \"collect-profiles-29416320-9knj9\" (UID: \"1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416320-9knj9" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.322678 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416320-9knj9"] Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.337710 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29416320-lkmf8"] Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.350943 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-purge-29416320-kjz69"] Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.400002 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/af05c4c3-513e-45b8-8eda-3eecdbb6561a-serviceca\") pod \"image-pruner-29416320-lkmf8\" (UID: 
\"af05c4c3-513e-45b8-8eda-3eecdbb6561a\") " pod="openshift-image-registry/image-pruner-29416320-lkmf8" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.400067 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5-scripts\") pod \"nova-cell0-db-purge-29416320-kjz69\" (UID: \"e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5\") " pod="openstack/nova-cell0-db-purge-29416320-kjz69" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.400124 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd-secret-volume\") pod \"collect-profiles-29416320-9knj9\" (UID: \"1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416320-9knj9" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.400211 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d498a8e-4ace-4a26-9c32-2dbc411c0b50-scripts\") pod \"nova-cell1-db-purge-29416320-x2nsf\" (UID: \"1d498a8e-4ace-4a26-9c32-2dbc411c0b50\") " pod="openstack/nova-cell1-db-purge-29416320-x2nsf" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.400239 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5-combined-ca-bundle\") pod \"nova-cell0-db-purge-29416320-kjz69\" (UID: \"e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5\") " pod="openstack/nova-cell0-db-purge-29416320-kjz69" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.400265 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1d498a8e-4ace-4a26-9c32-2dbc411c0b50-config-data\") pod \"nova-cell1-db-purge-29416320-x2nsf\" (UID: \"1d498a8e-4ace-4a26-9c32-2dbc411c0b50\") " pod="openstack/nova-cell1-db-purge-29416320-x2nsf" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.400305 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqxh5\" (UniqueName: \"kubernetes.io/projected/af05c4c3-513e-45b8-8eda-3eecdbb6561a-kube-api-access-hqxh5\") pod \"image-pruner-29416320-lkmf8\" (UID: \"af05c4c3-513e-45b8-8eda-3eecdbb6561a\") " pod="openshift-image-registry/image-pruner-29416320-lkmf8" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.400338 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd-config-volume\") pod \"collect-profiles-29416320-9knj9\" (UID: \"1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416320-9knj9" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.400372 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5-config-data\") pod \"nova-cell0-db-purge-29416320-kjz69\" (UID: \"e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5\") " pod="openstack/nova-cell0-db-purge-29416320-kjz69" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.400401 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tj49l\" (UniqueName: \"kubernetes.io/projected/1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd-kube-api-access-tj49l\") pod \"collect-profiles-29416320-9knj9\" (UID: \"1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416320-9knj9" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.400581 4734 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d498a8e-4ace-4a26-9c32-2dbc411c0b50-combined-ca-bundle\") pod \"nova-cell1-db-purge-29416320-x2nsf\" (UID: \"1d498a8e-4ace-4a26-9c32-2dbc411c0b50\") " pod="openstack/nova-cell1-db-purge-29416320-x2nsf" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.400734 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68mx4\" (UniqueName: \"kubernetes.io/projected/e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5-kube-api-access-68mx4\") pod \"nova-cell0-db-purge-29416320-kjz69\" (UID: \"e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5\") " pod="openstack/nova-cell0-db-purge-29416320-kjz69" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.400789 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr7l8\" (UniqueName: \"kubernetes.io/projected/1d498a8e-4ace-4a26-9c32-2dbc411c0b50-kube-api-access-rr7l8\") pod \"nova-cell1-db-purge-29416320-x2nsf\" (UID: \"1d498a8e-4ace-4a26-9c32-2dbc411c0b50\") " pod="openstack/nova-cell1-db-purge-29416320-x2nsf" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.402219 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd-config-volume\") pod \"collect-profiles-29416320-9knj9\" (UID: \"1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416320-9knj9" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.402847 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/af05c4c3-513e-45b8-8eda-3eecdbb6561a-serviceca\") pod \"image-pruner-29416320-lkmf8\" (UID: \"af05c4c3-513e-45b8-8eda-3eecdbb6561a\") " pod="openshift-image-registry/image-pruner-29416320-lkmf8" Dec 
06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.414706 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd-secret-volume\") pod \"collect-profiles-29416320-9knj9\" (UID: \"1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416320-9knj9" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.425266 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj49l\" (UniqueName: \"kubernetes.io/projected/1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd-kube-api-access-tj49l\") pod \"collect-profiles-29416320-9knj9\" (UID: \"1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416320-9knj9" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.425671 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqxh5\" (UniqueName: \"kubernetes.io/projected/af05c4c3-513e-45b8-8eda-3eecdbb6561a-kube-api-access-hqxh5\") pod \"image-pruner-29416320-lkmf8\" (UID: \"af05c4c3-513e-45b8-8eda-3eecdbb6561a\") " pod="openshift-image-registry/image-pruner-29416320-lkmf8" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.503636 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d498a8e-4ace-4a26-9c32-2dbc411c0b50-scripts\") pod \"nova-cell1-db-purge-29416320-x2nsf\" (UID: \"1d498a8e-4ace-4a26-9c32-2dbc411c0b50\") " pod="openstack/nova-cell1-db-purge-29416320-x2nsf" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.503705 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5-combined-ca-bundle\") pod \"nova-cell0-db-purge-29416320-kjz69\" (UID: \"e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5\") " 
pod="openstack/nova-cell0-db-purge-29416320-kjz69" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.503737 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d498a8e-4ace-4a26-9c32-2dbc411c0b50-config-data\") pod \"nova-cell1-db-purge-29416320-x2nsf\" (UID: \"1d498a8e-4ace-4a26-9c32-2dbc411c0b50\") " pod="openstack/nova-cell1-db-purge-29416320-x2nsf" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.503795 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5-config-data\") pod \"nova-cell0-db-purge-29416320-kjz69\" (UID: \"e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5\") " pod="openstack/nova-cell0-db-purge-29416320-kjz69" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.503833 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d498a8e-4ace-4a26-9c32-2dbc411c0b50-combined-ca-bundle\") pod \"nova-cell1-db-purge-29416320-x2nsf\" (UID: \"1d498a8e-4ace-4a26-9c32-2dbc411c0b50\") " pod="openstack/nova-cell1-db-purge-29416320-x2nsf" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.503872 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68mx4\" (UniqueName: \"kubernetes.io/projected/e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5-kube-api-access-68mx4\") pod \"nova-cell0-db-purge-29416320-kjz69\" (UID: \"e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5\") " pod="openstack/nova-cell0-db-purge-29416320-kjz69" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.503902 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr7l8\" (UniqueName: \"kubernetes.io/projected/1d498a8e-4ace-4a26-9c32-2dbc411c0b50-kube-api-access-rr7l8\") pod \"nova-cell1-db-purge-29416320-x2nsf\" (UID: 
\"1d498a8e-4ace-4a26-9c32-2dbc411c0b50\") " pod="openstack/nova-cell1-db-purge-29416320-x2nsf" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.503963 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5-scripts\") pod \"nova-cell0-db-purge-29416320-kjz69\" (UID: \"e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5\") " pod="openstack/nova-cell0-db-purge-29416320-kjz69" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.507953 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5-scripts\") pod \"nova-cell0-db-purge-29416320-kjz69\" (UID: \"e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5\") " pod="openstack/nova-cell0-db-purge-29416320-kjz69" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.516065 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d498a8e-4ace-4a26-9c32-2dbc411c0b50-scripts\") pod \"nova-cell1-db-purge-29416320-x2nsf\" (UID: \"1d498a8e-4ace-4a26-9c32-2dbc411c0b50\") " pod="openstack/nova-cell1-db-purge-29416320-x2nsf" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.518309 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d498a8e-4ace-4a26-9c32-2dbc411c0b50-combined-ca-bundle\") pod \"nova-cell1-db-purge-29416320-x2nsf\" (UID: \"1d498a8e-4ace-4a26-9c32-2dbc411c0b50\") " pod="openstack/nova-cell1-db-purge-29416320-x2nsf" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.518422 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5-combined-ca-bundle\") pod \"nova-cell0-db-purge-29416320-kjz69\" (UID: \"e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5\") " 
pod="openstack/nova-cell0-db-purge-29416320-kjz69" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.521887 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5-config-data\") pod \"nova-cell0-db-purge-29416320-kjz69\" (UID: \"e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5\") " pod="openstack/nova-cell0-db-purge-29416320-kjz69" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.527572 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d498a8e-4ace-4a26-9c32-2dbc411c0b50-config-data\") pod \"nova-cell1-db-purge-29416320-x2nsf\" (UID: \"1d498a8e-4ace-4a26-9c32-2dbc411c0b50\") " pod="openstack/nova-cell1-db-purge-29416320-x2nsf" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.530937 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68mx4\" (UniqueName: \"kubernetes.io/projected/e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5-kube-api-access-68mx4\") pod \"nova-cell0-db-purge-29416320-kjz69\" (UID: \"e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5\") " pod="openstack/nova-cell0-db-purge-29416320-kjz69" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.533641 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr7l8\" (UniqueName: \"kubernetes.io/projected/1d498a8e-4ace-4a26-9c32-2dbc411c0b50-kube-api-access-rr7l8\") pod \"nova-cell1-db-purge-29416320-x2nsf\" (UID: \"1d498a8e-4ace-4a26-9c32-2dbc411c0b50\") " pod="openstack/nova-cell1-db-purge-29416320-x2nsf" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.579717 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416320-9knj9" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.628161 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29416320-lkmf8" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.645708 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-purge-29416320-kjz69" Dec 06 00:00:00 crc kubenswrapper[4734]: I1206 00:00:00.656642 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-purge-29416320-x2nsf" Dec 06 00:00:01 crc kubenswrapper[4734]: I1206 00:00:01.250440 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29416320-lkmf8"] Dec 06 00:00:01 crc kubenswrapper[4734]: W1206 00:00:01.269708 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf05c4c3_513e_45b8_8eda_3eecdbb6561a.slice/crio-6b3911cd7da80982b839e59987f36a7c3c9012b189f588a93ff4af88eaaa2f13 WatchSource:0}: Error finding container 6b3911cd7da80982b839e59987f36a7c3c9012b189f588a93ff4af88eaaa2f13: Status 404 returned error can't find the container with id 6b3911cd7da80982b839e59987f36a7c3c9012b189f588a93ff4af88eaaa2f13 Dec 06 00:00:01 crc kubenswrapper[4734]: W1206 00:00:01.272981 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ba50a7d_69d9_4a30_8fbf_b3ef99257ebd.slice/crio-053f7226f5f9ffe31ccde492bea3e6f489dd750fd5badc5a4f2ce43c1a185697 WatchSource:0}: Error finding container 053f7226f5f9ffe31ccde492bea3e6f489dd750fd5badc5a4f2ce43c1a185697: Status 404 returned error can't find the container with id 053f7226f5f9ffe31ccde492bea3e6f489dd750fd5badc5a4f2ce43c1a185697 Dec 06 00:00:01 crc kubenswrapper[4734]: I1206 00:00:01.282220 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416320-9knj9"] Dec 06 00:00:01 crc kubenswrapper[4734]: I1206 00:00:01.456897 4734 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-purge-29416320-x2nsf"] Dec 06 00:00:01 crc kubenswrapper[4734]: I1206 00:00:01.610429 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-purge-29416320-kjz69"] Dec 06 00:00:01 crc kubenswrapper[4734]: I1206 00:00:01.615118 4734 scope.go:117] "RemoveContainer" containerID="c1125a47316243dbc8e4b9f56d99d1db26d491a48005ab2e218e005031c75762" Dec 06 00:00:01 crc kubenswrapper[4734]: E1206 00:00:01.615451 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:00:01 crc kubenswrapper[4734]: W1206 00:00:01.627212 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0efe4eb_7e90_4ea7_8c6a_d3c95c3845a5.slice/crio-4ef27b3626906a70ecf00bc9bb4b21ef8a1f4723e0ebb052de842705aad47330 WatchSource:0}: Error finding container 4ef27b3626906a70ecf00bc9bb4b21ef8a1f4723e0ebb052de842705aad47330: Status 404 returned error can't find the container with id 4ef27b3626906a70ecf00bc9bb4b21ef8a1f4723e0ebb052de842705aad47330 Dec 06 00:00:02 crc kubenswrapper[4734]: I1206 00:00:02.177187 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29416320-lkmf8" event={"ID":"af05c4c3-513e-45b8-8eda-3eecdbb6561a","Type":"ContainerStarted","Data":"e074bec709885c75af024495a517b478aaeca18d581029ebaaba39f6f42def5b"} Dec 06 00:00:02 crc kubenswrapper[4734]: I1206 00:00:02.178344 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29416320-lkmf8" 
event={"ID":"af05c4c3-513e-45b8-8eda-3eecdbb6561a","Type":"ContainerStarted","Data":"6b3911cd7da80982b839e59987f36a7c3c9012b189f588a93ff4af88eaaa2f13"} Dec 06 00:00:02 crc kubenswrapper[4734]: I1206 00:00:02.190867 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416320-9knj9" event={"ID":"1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd","Type":"ContainerStarted","Data":"677566d7b75b7943a993c250a5b4c9628608bb41d62a393222a2db4241817f35"} Dec 06 00:00:02 crc kubenswrapper[4734]: I1206 00:00:02.193662 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416320-9knj9" event={"ID":"1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd","Type":"ContainerStarted","Data":"053f7226f5f9ffe31ccde492bea3e6f489dd750fd5badc5a4f2ce43c1a185697"} Dec 06 00:00:02 crc kubenswrapper[4734]: I1206 00:00:02.206114 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-purge-29416320-x2nsf" event={"ID":"1d498a8e-4ace-4a26-9c32-2dbc411c0b50","Type":"ContainerStarted","Data":"a6afed96214f62074d45fbb56944e1e7b62479278d49b5d609c452ce3c408a25"} Dec 06 00:00:02 crc kubenswrapper[4734]: I1206 00:00:02.206195 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-purge-29416320-x2nsf" event={"ID":"1d498a8e-4ace-4a26-9c32-2dbc411c0b50","Type":"ContainerStarted","Data":"6c3912d015c1ddb7720f7682f58c302fbb9d94a69b8e28020739c644776a63fc"} Dec 06 00:00:02 crc kubenswrapper[4734]: I1206 00:00:02.208405 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-purge-29416320-kjz69" event={"ID":"e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5","Type":"ContainerStarted","Data":"b3dfba4e7a23363af9ec2267dfa176e26ca3b4e2843314968f4ac0d754f9f85c"} Dec 06 00:00:02 crc kubenswrapper[4734]: I1206 00:00:02.208456 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-purge-29416320-kjz69" 
event={"ID":"e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5","Type":"ContainerStarted","Data":"4ef27b3626906a70ecf00bc9bb4b21ef8a1f4723e0ebb052de842705aad47330"} Dec 06 00:00:02 crc kubenswrapper[4734]: I1206 00:00:02.212801 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29416320-lkmf8" podStartSLOduration=2.212775399 podStartE2EDuration="2.212775399s" podCreationTimestamp="2025-12-06 00:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 00:00:02.203329088 +0000 UTC m=+2422.886733354" watchObservedRunningTime="2025-12-06 00:00:02.212775399 +0000 UTC m=+2422.896179675" Dec 06 00:00:03 crc kubenswrapper[4734]: I1206 00:00:03.220833 4734 generic.go:334] "Generic (PLEG): container finished" podID="af05c4c3-513e-45b8-8eda-3eecdbb6561a" containerID="e074bec709885c75af024495a517b478aaeca18d581029ebaaba39f6f42def5b" exitCode=0 Dec 06 00:00:03 crc kubenswrapper[4734]: I1206 00:00:03.221424 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29416320-lkmf8" event={"ID":"af05c4c3-513e-45b8-8eda-3eecdbb6561a","Type":"ContainerDied","Data":"e074bec709885c75af024495a517b478aaeca18d581029ebaaba39f6f42def5b"} Dec 06 00:00:03 crc kubenswrapper[4734]: I1206 00:00:03.224472 4734 generic.go:334] "Generic (PLEG): container finished" podID="1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd" containerID="677566d7b75b7943a993c250a5b4c9628608bb41d62a393222a2db4241817f35" exitCode=0 Dec 06 00:00:03 crc kubenswrapper[4734]: I1206 00:00:03.224671 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416320-9knj9" event={"ID":"1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd","Type":"ContainerDied","Data":"677566d7b75b7943a993c250a5b4c9628608bb41d62a393222a2db4241817f35"} Dec 06 00:00:03 crc kubenswrapper[4734]: I1206 00:00:03.302935 4734 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-purge-29416320-kjz69" podStartSLOduration=3.302906035 podStartE2EDuration="3.302906035s" podCreationTimestamp="2025-12-06 00:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 00:00:03.277963386 +0000 UTC m=+2423.961367682" watchObservedRunningTime="2025-12-06 00:00:03.302906035 +0000 UTC m=+2423.986310311" Dec 06 00:00:03 crc kubenswrapper[4734]: I1206 00:00:03.342039 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-purge-29416320-x2nsf" podStartSLOduration=3.342010719 podStartE2EDuration="3.342010719s" podCreationTimestamp="2025-12-06 00:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 00:00:03.333508721 +0000 UTC m=+2424.016913007" watchObservedRunningTime="2025-12-06 00:00:03.342010719 +0000 UTC m=+2424.025414995" Dec 06 00:00:04 crc kubenswrapper[4734]: I1206 00:00:04.866804 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416320-9knj9" Dec 06 00:00:04 crc kubenswrapper[4734]: I1206 00:00:04.883481 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29416320-lkmf8" Dec 06 00:00:04 crc kubenswrapper[4734]: I1206 00:00:04.982119 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/af05c4c3-513e-45b8-8eda-3eecdbb6561a-serviceca\") pod \"af05c4c3-513e-45b8-8eda-3eecdbb6561a\" (UID: \"af05c4c3-513e-45b8-8eda-3eecdbb6561a\") " Dec 06 00:00:04 crc kubenswrapper[4734]: I1206 00:00:04.982913 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd-secret-volume\") pod \"1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd\" (UID: \"1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd\") " Dec 06 00:00:04 crc kubenswrapper[4734]: I1206 00:00:04.982955 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af05c4c3-513e-45b8-8eda-3eecdbb6561a-serviceca" (OuterVolumeSpecName: "serviceca") pod "af05c4c3-513e-45b8-8eda-3eecdbb6561a" (UID: "af05c4c3-513e-45b8-8eda-3eecdbb6561a"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 00:00:04 crc kubenswrapper[4734]: I1206 00:00:04.983043 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd-config-volume\") pod \"1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd\" (UID: \"1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd\") " Dec 06 00:00:04 crc kubenswrapper[4734]: I1206 00:00:04.986313 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd-config-volume" (OuterVolumeSpecName: "config-volume") pod "1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd" (UID: "1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 00:00:04 crc kubenswrapper[4734]: I1206 00:00:04.986790 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tj49l\" (UniqueName: \"kubernetes.io/projected/1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd-kube-api-access-tj49l\") pod \"1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd\" (UID: \"1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd\") " Dec 06 00:00:04 crc kubenswrapper[4734]: I1206 00:00:04.986875 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqxh5\" (UniqueName: \"kubernetes.io/projected/af05c4c3-513e-45b8-8eda-3eecdbb6561a-kube-api-access-hqxh5\") pod \"af05c4c3-513e-45b8-8eda-3eecdbb6561a\" (UID: \"af05c4c3-513e-45b8-8eda-3eecdbb6561a\") " Dec 06 00:00:04 crc kubenswrapper[4734]: I1206 00:00:04.993551 4734 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/af05c4c3-513e-45b8-8eda-3eecdbb6561a-serviceca\") on node \"crc\" DevicePath \"\"" Dec 06 00:00:04 crc kubenswrapper[4734]: I1206 00:00:04.993597 4734 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 00:00:05 crc kubenswrapper[4734]: I1206 00:00:05.000413 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd" (UID: "1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:00:05 crc kubenswrapper[4734]: I1206 00:00:05.004390 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd-kube-api-access-tj49l" (OuterVolumeSpecName: "kube-api-access-tj49l") pod "1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd" (UID: "1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd"). InnerVolumeSpecName "kube-api-access-tj49l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:00:05 crc kubenswrapper[4734]: I1206 00:00:05.004850 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af05c4c3-513e-45b8-8eda-3eecdbb6561a-kube-api-access-hqxh5" (OuterVolumeSpecName: "kube-api-access-hqxh5") pod "af05c4c3-513e-45b8-8eda-3eecdbb6561a" (UID: "af05c4c3-513e-45b8-8eda-3eecdbb6561a"). InnerVolumeSpecName "kube-api-access-hqxh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:00:05 crc kubenswrapper[4734]: I1206 00:00:05.095428 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tj49l\" (UniqueName: \"kubernetes.io/projected/1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd-kube-api-access-tj49l\") on node \"crc\" DevicePath \"\"" Dec 06 00:00:05 crc kubenswrapper[4734]: I1206 00:00:05.095490 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqxh5\" (UniqueName: \"kubernetes.io/projected/af05c4c3-513e-45b8-8eda-3eecdbb6561a-kube-api-access-hqxh5\") on node \"crc\" DevicePath \"\"" Dec 06 00:00:05 crc kubenswrapper[4734]: I1206 00:00:05.095514 4734 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 00:00:05 crc kubenswrapper[4734]: I1206 00:00:05.263547 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29416320-lkmf8" 
event={"ID":"af05c4c3-513e-45b8-8eda-3eecdbb6561a","Type":"ContainerDied","Data":"6b3911cd7da80982b839e59987f36a7c3c9012b189f588a93ff4af88eaaa2f13"} Dec 06 00:00:05 crc kubenswrapper[4734]: I1206 00:00:05.263615 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b3911cd7da80982b839e59987f36a7c3c9012b189f588a93ff4af88eaaa2f13" Dec 06 00:00:05 crc kubenswrapper[4734]: I1206 00:00:05.263625 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29416320-lkmf8" Dec 06 00:00:05 crc kubenswrapper[4734]: I1206 00:00:05.267988 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416320-9knj9" event={"ID":"1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd","Type":"ContainerDied","Data":"053f7226f5f9ffe31ccde492bea3e6f489dd750fd5badc5a4f2ce43c1a185697"} Dec 06 00:00:05 crc kubenswrapper[4734]: I1206 00:00:05.268039 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="053f7226f5f9ffe31ccde492bea3e6f489dd750fd5badc5a4f2ce43c1a185697" Dec 06 00:00:05 crc kubenswrapper[4734]: I1206 00:00:05.268073 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416320-9knj9" Dec 06 00:00:05 crc kubenswrapper[4734]: I1206 00:00:05.985861 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416275-mfkvp"] Dec 06 00:00:05 crc kubenswrapper[4734]: I1206 00:00:05.998967 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416275-mfkvp"] Dec 06 00:00:07 crc kubenswrapper[4734]: I1206 00:00:07.628334 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a20dbad-8352-4804-9c0e-a2b6108a0d1b" path="/var/lib/kubelet/pods/2a20dbad-8352-4804-9c0e-a2b6108a0d1b/volumes" Dec 06 00:00:09 crc kubenswrapper[4734]: I1206 00:00:09.334791 4734 generic.go:334] "Generic (PLEG): container finished" podID="1d498a8e-4ace-4a26-9c32-2dbc411c0b50" containerID="a6afed96214f62074d45fbb56944e1e7b62479278d49b5d609c452ce3c408a25" exitCode=0 Dec 06 00:00:09 crc kubenswrapper[4734]: I1206 00:00:09.335088 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-purge-29416320-x2nsf" event={"ID":"1d498a8e-4ace-4a26-9c32-2dbc411c0b50","Type":"ContainerDied","Data":"a6afed96214f62074d45fbb56944e1e7b62479278d49b5d609c452ce3c408a25"} Dec 06 00:00:09 crc kubenswrapper[4734]: I1206 00:00:09.342001 4734 generic.go:334] "Generic (PLEG): container finished" podID="e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5" containerID="b3dfba4e7a23363af9ec2267dfa176e26ca3b4e2843314968f4ac0d754f9f85c" exitCode=0 Dec 06 00:00:09 crc kubenswrapper[4734]: I1206 00:00:09.342075 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-purge-29416320-kjz69" event={"ID":"e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5","Type":"ContainerDied","Data":"b3dfba4e7a23363af9ec2267dfa176e26ca3b4e2843314968f4ac0d754f9f85c"} Dec 06 00:00:10 crc kubenswrapper[4734]: I1206 00:00:10.813508 4734 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-cell1-db-purge-29416320-x2nsf" Dec 06 00:00:10 crc kubenswrapper[4734]: I1206 00:00:10.815733 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-purge-29416320-kjz69" Dec 06 00:00:10 crc kubenswrapper[4734]: I1206 00:00:10.931171 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d498a8e-4ace-4a26-9c32-2dbc411c0b50-config-data\") pod \"1d498a8e-4ace-4a26-9c32-2dbc411c0b50\" (UID: \"1d498a8e-4ace-4a26-9c32-2dbc411c0b50\") " Dec 06 00:00:10 crc kubenswrapper[4734]: I1206 00:00:10.931300 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68mx4\" (UniqueName: \"kubernetes.io/projected/e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5-kube-api-access-68mx4\") pod \"e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5\" (UID: \"e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5\") " Dec 06 00:00:10 crc kubenswrapper[4734]: I1206 00:00:10.931330 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr7l8\" (UniqueName: \"kubernetes.io/projected/1d498a8e-4ace-4a26-9c32-2dbc411c0b50-kube-api-access-rr7l8\") pod \"1d498a8e-4ace-4a26-9c32-2dbc411c0b50\" (UID: \"1d498a8e-4ace-4a26-9c32-2dbc411c0b50\") " Dec 06 00:00:10 crc kubenswrapper[4734]: I1206 00:00:10.931543 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5-config-data\") pod \"e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5\" (UID: \"e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5\") " Dec 06 00:00:10 crc kubenswrapper[4734]: I1206 00:00:10.931617 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d498a8e-4ace-4a26-9c32-2dbc411c0b50-scripts\") pod \"1d498a8e-4ace-4a26-9c32-2dbc411c0b50\" (UID: 
\"1d498a8e-4ace-4a26-9c32-2dbc411c0b50\") " Dec 06 00:00:10 crc kubenswrapper[4734]: I1206 00:00:10.931711 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5-combined-ca-bundle\") pod \"e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5\" (UID: \"e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5\") " Dec 06 00:00:10 crc kubenswrapper[4734]: I1206 00:00:10.931780 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d498a8e-4ace-4a26-9c32-2dbc411c0b50-combined-ca-bundle\") pod \"1d498a8e-4ace-4a26-9c32-2dbc411c0b50\" (UID: \"1d498a8e-4ace-4a26-9c32-2dbc411c0b50\") " Dec 06 00:00:10 crc kubenswrapper[4734]: I1206 00:00:10.931988 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5-scripts\") pod \"e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5\" (UID: \"e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5\") " Dec 06 00:00:10 crc kubenswrapper[4734]: I1206 00:00:10.956808 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d498a8e-4ace-4a26-9c32-2dbc411c0b50-scripts" (OuterVolumeSpecName: "scripts") pod "1d498a8e-4ace-4a26-9c32-2dbc411c0b50" (UID: "1d498a8e-4ace-4a26-9c32-2dbc411c0b50"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:00:10 crc kubenswrapper[4734]: I1206 00:00:10.970351 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5-kube-api-access-68mx4" (OuterVolumeSpecName: "kube-api-access-68mx4") pod "e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5" (UID: "e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5"). InnerVolumeSpecName "kube-api-access-68mx4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:00:10 crc kubenswrapper[4734]: I1206 00:00:10.971012 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d498a8e-4ace-4a26-9c32-2dbc411c0b50-kube-api-access-rr7l8" (OuterVolumeSpecName: "kube-api-access-rr7l8") pod "1d498a8e-4ace-4a26-9c32-2dbc411c0b50" (UID: "1d498a8e-4ace-4a26-9c32-2dbc411c0b50"). InnerVolumeSpecName "kube-api-access-rr7l8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:00:10 crc kubenswrapper[4734]: I1206 00:00:10.974316 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5-scripts" (OuterVolumeSpecName: "scripts") pod "e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5" (UID: "e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:00:10 crc kubenswrapper[4734]: I1206 00:00:10.981588 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5-config-data" (OuterVolumeSpecName: "config-data") pod "e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5" (UID: "e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:00:10 crc kubenswrapper[4734]: I1206 00:00:10.982706 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d498a8e-4ace-4a26-9c32-2dbc411c0b50-config-data" (OuterVolumeSpecName: "config-data") pod "1d498a8e-4ace-4a26-9c32-2dbc411c0b50" (UID: "1d498a8e-4ace-4a26-9c32-2dbc411c0b50"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:00:11 crc kubenswrapper[4734]: I1206 00:00:11.002977 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d498a8e-4ace-4a26-9c32-2dbc411c0b50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d498a8e-4ace-4a26-9c32-2dbc411c0b50" (UID: "1d498a8e-4ace-4a26-9c32-2dbc411c0b50"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:00:11 crc kubenswrapper[4734]: I1206 00:00:11.013122 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5" (UID: "e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:00:11 crc kubenswrapper[4734]: I1206 00:00:11.034336 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 00:00:11 crc kubenswrapper[4734]: I1206 00:00:11.034373 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d498a8e-4ace-4a26-9c32-2dbc411c0b50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 00:00:11 crc kubenswrapper[4734]: I1206 00:00:11.034385 4734 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 00:00:11 crc kubenswrapper[4734]: I1206 00:00:11.034394 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d498a8e-4ace-4a26-9c32-2dbc411c0b50-config-data\") on node \"crc\" DevicePath 
\"\"" Dec 06 00:00:11 crc kubenswrapper[4734]: I1206 00:00:11.034405 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr7l8\" (UniqueName: \"kubernetes.io/projected/1d498a8e-4ace-4a26-9c32-2dbc411c0b50-kube-api-access-rr7l8\") on node \"crc\" DevicePath \"\"" Dec 06 00:00:11 crc kubenswrapper[4734]: I1206 00:00:11.034415 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68mx4\" (UniqueName: \"kubernetes.io/projected/e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5-kube-api-access-68mx4\") on node \"crc\" DevicePath \"\"" Dec 06 00:00:11 crc kubenswrapper[4734]: I1206 00:00:11.034425 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 00:00:11 crc kubenswrapper[4734]: I1206 00:00:11.034433 4734 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d498a8e-4ace-4a26-9c32-2dbc411c0b50-scripts\") on node \"crc\" DevicePath \"\"" Dec 06 00:00:11 crc kubenswrapper[4734]: I1206 00:00:11.361452 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-purge-29416320-x2nsf" event={"ID":"1d498a8e-4ace-4a26-9c32-2dbc411c0b50","Type":"ContainerDied","Data":"6c3912d015c1ddb7720f7682f58c302fbb9d94a69b8e28020739c644776a63fc"} Dec 06 00:00:11 crc kubenswrapper[4734]: I1206 00:00:11.361940 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c3912d015c1ddb7720f7682f58c302fbb9d94a69b8e28020739c644776a63fc" Dec 06 00:00:11 crc kubenswrapper[4734]: I1206 00:00:11.361749 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-purge-29416320-x2nsf" Dec 06 00:00:11 crc kubenswrapper[4734]: I1206 00:00:11.363611 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-purge-29416320-kjz69" event={"ID":"e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5","Type":"ContainerDied","Data":"4ef27b3626906a70ecf00bc9bb4b21ef8a1f4723e0ebb052de842705aad47330"} Dec 06 00:00:11 crc kubenswrapper[4734]: I1206 00:00:11.363668 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ef27b3626906a70ecf00bc9bb4b21ef8a1f4723e0ebb052de842705aad47330" Dec 06 00:00:11 crc kubenswrapper[4734]: I1206 00:00:11.363682 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-purge-29416320-kjz69" Dec 06 00:00:16 crc kubenswrapper[4734]: I1206 00:00:16.614462 4734 scope.go:117] "RemoveContainer" containerID="c1125a47316243dbc8e4b9f56d99d1db26d491a48005ab2e218e005031c75762" Dec 06 00:00:16 crc kubenswrapper[4734]: E1206 00:00:16.615305 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:00:31 crc kubenswrapper[4734]: I1206 00:00:31.614451 4734 scope.go:117] "RemoveContainer" containerID="c1125a47316243dbc8e4b9f56d99d1db26d491a48005ab2e218e005031c75762" Dec 06 00:00:31 crc kubenswrapper[4734]: E1206 00:00:31.615880 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:00:40 crc kubenswrapper[4734]: I1206 00:00:40.661473 4734 generic.go:334] "Generic (PLEG): container finished" podID="85f32997-f801-4f60-b010-aaff637a8292" containerID="ff6b4e9aedfa59715254dd80adc0423be5feb648f27ecf91368e37cd83027e31" exitCode=0 Dec 06 00:00:40 crc kubenswrapper[4734]: I1206 00:00:40.661586 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk" event={"ID":"85f32997-f801-4f60-b010-aaff637a8292","Type":"ContainerDied","Data":"ff6b4e9aedfa59715254dd80adc0423be5feb648f27ecf91368e37cd83027e31"} Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.118807 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.167077 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f32997-f801-4f60-b010-aaff637a8292-libvirt-combined-ca-bundle\") pod \"85f32997-f801-4f60-b010-aaff637a8292\" (UID: \"85f32997-f801-4f60-b010-aaff637a8292\") " Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.167682 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crnzl\" (UniqueName: \"kubernetes.io/projected/85f32997-f801-4f60-b010-aaff637a8292-kube-api-access-crnzl\") pod \"85f32997-f801-4f60-b010-aaff637a8292\" (UID: \"85f32997-f801-4f60-b010-aaff637a8292\") " Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.167826 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/85f32997-f801-4f60-b010-aaff637a8292-ssh-key\") pod \"85f32997-f801-4f60-b010-aaff637a8292\" (UID: \"85f32997-f801-4f60-b010-aaff637a8292\") " Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.168078 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85f32997-f801-4f60-b010-aaff637a8292-inventory\") pod \"85f32997-f801-4f60-b010-aaff637a8292\" (UID: \"85f32997-f801-4f60-b010-aaff637a8292\") " Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.168184 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/85f32997-f801-4f60-b010-aaff637a8292-libvirt-secret-0\") pod \"85f32997-f801-4f60-b010-aaff637a8292\" (UID: \"85f32997-f801-4f60-b010-aaff637a8292\") " Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.177750 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85f32997-f801-4f60-b010-aaff637a8292-kube-api-access-crnzl" (OuterVolumeSpecName: "kube-api-access-crnzl") pod "85f32997-f801-4f60-b010-aaff637a8292" (UID: "85f32997-f801-4f60-b010-aaff637a8292"). InnerVolumeSpecName "kube-api-access-crnzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.177973 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85f32997-f801-4f60-b010-aaff637a8292-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "85f32997-f801-4f60-b010-aaff637a8292" (UID: "85f32997-f801-4f60-b010-aaff637a8292"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.204232 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85f32997-f801-4f60-b010-aaff637a8292-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "85f32997-f801-4f60-b010-aaff637a8292" (UID: "85f32997-f801-4f60-b010-aaff637a8292"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.206570 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85f32997-f801-4f60-b010-aaff637a8292-inventory" (OuterVolumeSpecName: "inventory") pod "85f32997-f801-4f60-b010-aaff637a8292" (UID: "85f32997-f801-4f60-b010-aaff637a8292"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.215163 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85f32997-f801-4f60-b010-aaff637a8292-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "85f32997-f801-4f60-b010-aaff637a8292" (UID: "85f32997-f801-4f60-b010-aaff637a8292"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.271726 4734 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85f32997-f801-4f60-b010-aaff637a8292-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.271809 4734 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/85f32997-f801-4f60-b010-aaff637a8292-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.271829 4734 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f32997-f801-4f60-b010-aaff637a8292-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.271846 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crnzl\" (UniqueName: \"kubernetes.io/projected/85f32997-f801-4f60-b010-aaff637a8292-kube-api-access-crnzl\") on node \"crc\" DevicePath \"\"" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.271878 4734 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85f32997-f801-4f60-b010-aaff637a8292-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.685167 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk" event={"ID":"85f32997-f801-4f60-b010-aaff637a8292","Type":"ContainerDied","Data":"880a7d5e59069778fc05d1d251fa5c08e5cf66434b8b6dcd792a95603614120f"} Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.685617 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="880a7d5e59069778fc05d1d251fa5c08e5cf66434b8b6dcd792a95603614120f" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 
00:00:42.685558 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.799066 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts"] Dec 06 00:00:42 crc kubenswrapper[4734]: E1206 00:00:42.799732 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd" containerName="collect-profiles" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.799763 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd" containerName="collect-profiles" Dec 06 00:00:42 crc kubenswrapper[4734]: E1206 00:00:42.799794 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85f32997-f801-4f60-b010-aaff637a8292" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.799804 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="85f32997-f801-4f60-b010-aaff637a8292" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 06 00:00:42 crc kubenswrapper[4734]: E1206 00:00:42.799818 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5" containerName="nova-manage" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.799827 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5" containerName="nova-manage" Dec 06 00:00:42 crc kubenswrapper[4734]: E1206 00:00:42.799865 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d498a8e-4ace-4a26-9c32-2dbc411c0b50" containerName="nova-manage" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.799873 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d498a8e-4ace-4a26-9c32-2dbc411c0b50" containerName="nova-manage" Dec 06 00:00:42 crc 
kubenswrapper[4734]: E1206 00:00:42.799888 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af05c4c3-513e-45b8-8eda-3eecdbb6561a" containerName="image-pruner" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.799897 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="af05c4c3-513e-45b8-8eda-3eecdbb6561a" containerName="image-pruner" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.800133 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="85f32997-f801-4f60-b010-aaff637a8292" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.800171 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d498a8e-4ace-4a26-9c32-2dbc411c0b50" containerName="nova-manage" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.800186 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5" containerName="nova-manage" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.800196 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="af05c4c3-513e-45b8-8eda-3eecdbb6561a" containerName="image-pruner" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.800208 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ba50a7d-69d9-4a30-8fbf-b3ef99257ebd" containerName="collect-profiles" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.801657 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.809669 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.809986 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.810169 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gsdqx" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.810363 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.810191 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.810243 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.811042 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.830985 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts"] Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.885239 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l7kts\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.885327 4734 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l7kts\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.885378 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7d966291-cd7e-47ce-a95e-bee879371108-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l7kts\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.885399 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l7kts\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.885433 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72sz5\" (UniqueName: \"kubernetes.io/projected/7d966291-cd7e-47ce-a95e-bee879371108-kube-api-access-72sz5\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l7kts\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.885462 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l7kts\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.885494 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l7kts\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.885537 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l7kts\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.885745 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l7kts\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.987813 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l7kts\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.987906 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7d966291-cd7e-47ce-a95e-bee879371108-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l7kts\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.987943 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l7kts\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.987988 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72sz5\" (UniqueName: \"kubernetes.io/projected/7d966291-cd7e-47ce-a95e-bee879371108-kube-api-access-72sz5\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l7kts\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.988022 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l7kts\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.988056 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l7kts\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.988088 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l7kts\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.988168 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l7kts\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.988221 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l7kts\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.988896 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7d966291-cd7e-47ce-a95e-bee879371108-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l7kts\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" Dec 06 
00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.994819 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l7kts\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.994918 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l7kts\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.995563 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l7kts\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.995584 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l7kts\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" Dec 06 00:00:42 crc kubenswrapper[4734]: I1206 00:00:42.995870 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-nova-cell1-compute-config-0\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-l7kts\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" Dec 06 00:00:43 crc kubenswrapper[4734]: I1206 00:00:43.001319 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l7kts\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" Dec 06 00:00:43 crc kubenswrapper[4734]: I1206 00:00:43.001852 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l7kts\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" Dec 06 00:00:43 crc kubenswrapper[4734]: I1206 00:00:43.009986 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72sz5\" (UniqueName: \"kubernetes.io/projected/7d966291-cd7e-47ce-a95e-bee879371108-kube-api-access-72sz5\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l7kts\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" Dec 06 00:00:43 crc kubenswrapper[4734]: I1206 00:00:43.134290 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" Dec 06 00:00:43 crc kubenswrapper[4734]: I1206 00:00:43.655010 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts"] Dec 06 00:00:43 crc kubenswrapper[4734]: I1206 00:00:43.698727 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" event={"ID":"7d966291-cd7e-47ce-a95e-bee879371108","Type":"ContainerStarted","Data":"3fd0c392b5797ef34d1d7f89726a75b1206c3b3d3b559298ffb70c06abe46d10"} Dec 06 00:00:44 crc kubenswrapper[4734]: I1206 00:00:44.710922 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" event={"ID":"7d966291-cd7e-47ce-a95e-bee879371108","Type":"ContainerStarted","Data":"7a4c85c02cc2ef915f5df5361ef242f94ad4de6480e7f2b5eb7f86094d586dec"} Dec 06 00:00:44 crc kubenswrapper[4734]: I1206 00:00:44.738252 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" podStartSLOduration=2.211640312 podStartE2EDuration="2.738222203s" podCreationTimestamp="2025-12-06 00:00:42 +0000 UTC" firstStartedPulling="2025-12-06 00:00:43.663775401 +0000 UTC m=+2464.347179677" lastFinishedPulling="2025-12-06 00:00:44.190357292 +0000 UTC m=+2464.873761568" observedRunningTime="2025-12-06 00:00:44.730668359 +0000 UTC m=+2465.414072645" watchObservedRunningTime="2025-12-06 00:00:44.738222203 +0000 UTC m=+2465.421626479" Dec 06 00:00:46 crc kubenswrapper[4734]: I1206 00:00:46.614408 4734 scope.go:117] "RemoveContainer" containerID="c1125a47316243dbc8e4b9f56d99d1db26d491a48005ab2e218e005031c75762" Dec 06 00:00:46 crc kubenswrapper[4734]: E1206 00:00:46.615597 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:00:52 crc kubenswrapper[4734]: I1206 00:00:52.438692 4734 scope.go:117] "RemoveContainer" containerID="c2ab13668511b3efa65133e7ec2f85f1d91583ee811fb50b4e0a228eac2de9b8" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.166634 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29416321-nx5mh"] Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.169734 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29416321-nx5mh" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.180208 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-purge-29416321-tqhwd"] Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.182106 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-purge-29416321-tqhwd" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.195080 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-purge-29416321-zr2lx"] Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.201651 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-purge-29416321-zr2lx" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.206258 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29416321-nx5mh"] Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.210453 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.220044 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-purge-29416321-tqhwd"] Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.234287 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-purge-29416321-zr2lx"] Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.287465 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e2a8f39-3819-46e4-9f5c-b2378637486f-config-data\") pod \"keystone-cron-29416321-nx5mh\" (UID: \"0e2a8f39-3819-46e4-9f5c-b2378637486f\") " pod="openstack/keystone-cron-29416321-nx5mh" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.287643 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/e30aee89-812f-4e60-997e-54de845b7afe-db-purge-config-data\") pod \"glance-db-purge-29416321-zr2lx\" (UID: \"e30aee89-812f-4e60-997e-54de845b7afe\") " pod="openstack/glance-db-purge-29416321-zr2lx" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.287688 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124-db-purge-config-data\") pod \"cinder-db-purge-29416321-tqhwd\" (UID: \"3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124\") " pod="openstack/cinder-db-purge-29416321-tqhwd" Dec 06 00:01:00 crc 
kubenswrapper[4734]: I1206 00:01:00.287710 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124-combined-ca-bundle\") pod \"cinder-db-purge-29416321-tqhwd\" (UID: \"3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124\") " pod="openstack/cinder-db-purge-29416321-tqhwd" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.287750 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0e2a8f39-3819-46e4-9f5c-b2378637486f-fernet-keys\") pod \"keystone-cron-29416321-nx5mh\" (UID: \"0e2a8f39-3819-46e4-9f5c-b2378637486f\") " pod="openstack/keystone-cron-29416321-nx5mh" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.287811 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e30aee89-812f-4e60-997e-54de845b7afe-config-data\") pod \"glance-db-purge-29416321-zr2lx\" (UID: \"e30aee89-812f-4e60-997e-54de845b7afe\") " pod="openstack/glance-db-purge-29416321-zr2lx" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.287834 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e2a8f39-3819-46e4-9f5c-b2378637486f-combined-ca-bundle\") pod \"keystone-cron-29416321-nx5mh\" (UID: \"0e2a8f39-3819-46e4-9f5c-b2378637486f\") " pod="openstack/keystone-cron-29416321-nx5mh" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.287898 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvl99\" (UniqueName: \"kubernetes.io/projected/3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124-kube-api-access-qvl99\") pod \"cinder-db-purge-29416321-tqhwd\" (UID: \"3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124\") " 
pod="openstack/cinder-db-purge-29416321-tqhwd" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.287954 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lhmd\" (UniqueName: \"kubernetes.io/projected/e30aee89-812f-4e60-997e-54de845b7afe-kube-api-access-2lhmd\") pod \"glance-db-purge-29416321-zr2lx\" (UID: \"e30aee89-812f-4e60-997e-54de845b7afe\") " pod="openstack/glance-db-purge-29416321-zr2lx" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.288005 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124-config-data\") pod \"cinder-db-purge-29416321-tqhwd\" (UID: \"3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124\") " pod="openstack/cinder-db-purge-29416321-tqhwd" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.288050 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e30aee89-812f-4e60-997e-54de845b7afe-combined-ca-bundle\") pod \"glance-db-purge-29416321-zr2lx\" (UID: \"e30aee89-812f-4e60-997e-54de845b7afe\") " pod="openstack/glance-db-purge-29416321-zr2lx" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.288079 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64lpv\" (UniqueName: \"kubernetes.io/projected/0e2a8f39-3819-46e4-9f5c-b2378637486f-kube-api-access-64lpv\") pod \"keystone-cron-29416321-nx5mh\" (UID: \"0e2a8f39-3819-46e4-9f5c-b2378637486f\") " pod="openstack/keystone-cron-29416321-nx5mh" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.390325 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e2a8f39-3819-46e4-9f5c-b2378637486f-config-data\") pod \"keystone-cron-29416321-nx5mh\" 
(UID: \"0e2a8f39-3819-46e4-9f5c-b2378637486f\") " pod="openstack/keystone-cron-29416321-nx5mh" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.390425 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/e30aee89-812f-4e60-997e-54de845b7afe-db-purge-config-data\") pod \"glance-db-purge-29416321-zr2lx\" (UID: \"e30aee89-812f-4e60-997e-54de845b7afe\") " pod="openstack/glance-db-purge-29416321-zr2lx" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.390467 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124-db-purge-config-data\") pod \"cinder-db-purge-29416321-tqhwd\" (UID: \"3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124\") " pod="openstack/cinder-db-purge-29416321-tqhwd" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.390487 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124-combined-ca-bundle\") pod \"cinder-db-purge-29416321-tqhwd\" (UID: \"3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124\") " pod="openstack/cinder-db-purge-29416321-tqhwd" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.390536 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0e2a8f39-3819-46e4-9f5c-b2378637486f-fernet-keys\") pod \"keystone-cron-29416321-nx5mh\" (UID: \"0e2a8f39-3819-46e4-9f5c-b2378637486f\") " pod="openstack/keystone-cron-29416321-nx5mh" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.390564 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e30aee89-812f-4e60-997e-54de845b7afe-config-data\") pod \"glance-db-purge-29416321-zr2lx\" (UID: 
\"e30aee89-812f-4e60-997e-54de845b7afe\") " pod="openstack/glance-db-purge-29416321-zr2lx" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.390593 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e2a8f39-3819-46e4-9f5c-b2378637486f-combined-ca-bundle\") pod \"keystone-cron-29416321-nx5mh\" (UID: \"0e2a8f39-3819-46e4-9f5c-b2378637486f\") " pod="openstack/keystone-cron-29416321-nx5mh" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.390638 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvl99\" (UniqueName: \"kubernetes.io/projected/3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124-kube-api-access-qvl99\") pod \"cinder-db-purge-29416321-tqhwd\" (UID: \"3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124\") " pod="openstack/cinder-db-purge-29416321-tqhwd" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.390683 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lhmd\" (UniqueName: \"kubernetes.io/projected/e30aee89-812f-4e60-997e-54de845b7afe-kube-api-access-2lhmd\") pod \"glance-db-purge-29416321-zr2lx\" (UID: \"e30aee89-812f-4e60-997e-54de845b7afe\") " pod="openstack/glance-db-purge-29416321-zr2lx" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.390719 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124-config-data\") pod \"cinder-db-purge-29416321-tqhwd\" (UID: \"3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124\") " pod="openstack/cinder-db-purge-29416321-tqhwd" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.390747 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64lpv\" (UniqueName: \"kubernetes.io/projected/0e2a8f39-3819-46e4-9f5c-b2378637486f-kube-api-access-64lpv\") pod \"keystone-cron-29416321-nx5mh\" (UID: 
\"0e2a8f39-3819-46e4-9f5c-b2378637486f\") " pod="openstack/keystone-cron-29416321-nx5mh" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.390765 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e30aee89-812f-4e60-997e-54de845b7afe-combined-ca-bundle\") pod \"glance-db-purge-29416321-zr2lx\" (UID: \"e30aee89-812f-4e60-997e-54de845b7afe\") " pod="openstack/glance-db-purge-29416321-zr2lx" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.399032 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e30aee89-812f-4e60-997e-54de845b7afe-config-data\") pod \"glance-db-purge-29416321-zr2lx\" (UID: \"e30aee89-812f-4e60-997e-54de845b7afe\") " pod="openstack/glance-db-purge-29416321-zr2lx" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.399066 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124-combined-ca-bundle\") pod \"cinder-db-purge-29416321-tqhwd\" (UID: \"3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124\") " pod="openstack/cinder-db-purge-29416321-tqhwd" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.399542 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0e2a8f39-3819-46e4-9f5c-b2378637486f-fernet-keys\") pod \"keystone-cron-29416321-nx5mh\" (UID: \"0e2a8f39-3819-46e4-9f5c-b2378637486f\") " pod="openstack/keystone-cron-29416321-nx5mh" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.400622 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124-config-data\") pod \"cinder-db-purge-29416321-tqhwd\" (UID: \"3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124\") " 
pod="openstack/cinder-db-purge-29416321-tqhwd" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.401773 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/e30aee89-812f-4e60-997e-54de845b7afe-db-purge-config-data\") pod \"glance-db-purge-29416321-zr2lx\" (UID: \"e30aee89-812f-4e60-997e-54de845b7afe\") " pod="openstack/glance-db-purge-29416321-zr2lx" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.402294 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e2a8f39-3819-46e4-9f5c-b2378637486f-combined-ca-bundle\") pod \"keystone-cron-29416321-nx5mh\" (UID: \"0e2a8f39-3819-46e4-9f5c-b2378637486f\") " pod="openstack/keystone-cron-29416321-nx5mh" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.403994 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124-db-purge-config-data\") pod \"cinder-db-purge-29416321-tqhwd\" (UID: \"3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124\") " pod="openstack/cinder-db-purge-29416321-tqhwd" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.405420 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e2a8f39-3819-46e4-9f5c-b2378637486f-config-data\") pod \"keystone-cron-29416321-nx5mh\" (UID: \"0e2a8f39-3819-46e4-9f5c-b2378637486f\") " pod="openstack/keystone-cron-29416321-nx5mh" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.407979 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e30aee89-812f-4e60-997e-54de845b7afe-combined-ca-bundle\") pod \"glance-db-purge-29416321-zr2lx\" (UID: \"e30aee89-812f-4e60-997e-54de845b7afe\") " pod="openstack/glance-db-purge-29416321-zr2lx" Dec 06 00:01:00 
crc kubenswrapper[4734]: I1206 00:01:00.415335 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lhmd\" (UniqueName: \"kubernetes.io/projected/e30aee89-812f-4e60-997e-54de845b7afe-kube-api-access-2lhmd\") pod \"glance-db-purge-29416321-zr2lx\" (UID: \"e30aee89-812f-4e60-997e-54de845b7afe\") " pod="openstack/glance-db-purge-29416321-zr2lx" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.415453 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64lpv\" (UniqueName: \"kubernetes.io/projected/0e2a8f39-3819-46e4-9f5c-b2378637486f-kube-api-access-64lpv\") pod \"keystone-cron-29416321-nx5mh\" (UID: \"0e2a8f39-3819-46e4-9f5c-b2378637486f\") " pod="openstack/keystone-cron-29416321-nx5mh" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.416864 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvl99\" (UniqueName: \"kubernetes.io/projected/3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124-kube-api-access-qvl99\") pod \"cinder-db-purge-29416321-tqhwd\" (UID: \"3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124\") " pod="openstack/cinder-db-purge-29416321-tqhwd" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.507508 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29416321-nx5mh" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.529702 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-purge-29416321-tqhwd" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.564182 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-purge-29416321-zr2lx" Dec 06 00:01:00 crc kubenswrapper[4734]: I1206 00:01:00.615738 4734 scope.go:117] "RemoveContainer" containerID="c1125a47316243dbc8e4b9f56d99d1db26d491a48005ab2e218e005031c75762" Dec 06 00:01:00 crc kubenswrapper[4734]: E1206 00:01:00.620175 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:01:01 crc kubenswrapper[4734]: I1206 00:01:01.056647 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29416321-nx5mh"] Dec 06 00:01:01 crc kubenswrapper[4734]: I1206 00:01:01.071628 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-purge-29416321-tqhwd"] Dec 06 00:01:01 crc kubenswrapper[4734]: I1206 00:01:01.195225 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-purge-29416321-zr2lx"] Dec 06 00:01:01 crc kubenswrapper[4734]: W1206 00:01:01.204949 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode30aee89_812f_4e60_997e_54de845b7afe.slice/crio-9e63981c8875f9125b534eb02fa273acf980c8857b69e97db49f0972404f8162 WatchSource:0}: Error finding container 9e63981c8875f9125b534eb02fa273acf980c8857b69e97db49f0972404f8162: Status 404 returned error can't find the container with id 9e63981c8875f9125b534eb02fa273acf980c8857b69e97db49f0972404f8162 Dec 06 00:01:01 crc kubenswrapper[4734]: I1206 00:01:01.921727 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-purge-29416321-tqhwd" 
event={"ID":"3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124","Type":"ContainerStarted","Data":"f47899d9addc753a0a8ed1cd2f3ac116fa9d6ff03fd1396408fcdfee31a53564"} Dec 06 00:01:01 crc kubenswrapper[4734]: I1206 00:01:01.928675 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416321-nx5mh" event={"ID":"0e2a8f39-3819-46e4-9f5c-b2378637486f","Type":"ContainerStarted","Data":"acf49d7498f26ec48cee8b863016765bda3f00ab8c7e9cc5434b88a6fafd2c4a"} Dec 06 00:01:01 crc kubenswrapper[4734]: I1206 00:01:01.929655 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416321-nx5mh" event={"ID":"0e2a8f39-3819-46e4-9f5c-b2378637486f","Type":"ContainerStarted","Data":"1330e3ebf841d52851bb9f5c26fbdbf97d1ad70946e44fe2e9639f9324bfa8e5"} Dec 06 00:01:01 crc kubenswrapper[4734]: I1206 00:01:01.933082 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-purge-29416321-zr2lx" event={"ID":"e30aee89-812f-4e60-997e-54de845b7afe","Type":"ContainerStarted","Data":"9e63981c8875f9125b534eb02fa273acf980c8857b69e97db49f0972404f8162"} Dec 06 00:01:01 crc kubenswrapper[4734]: I1206 00:01:01.950026 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-purge-29416321-tqhwd" podStartSLOduration=1.949999963 podStartE2EDuration="1.949999963s" podCreationTimestamp="2025-12-06 00:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 00:01:01.945498153 +0000 UTC m=+2482.628902439" watchObservedRunningTime="2025-12-06 00:01:01.949999963 +0000 UTC m=+2482.633404239" Dec 06 00:01:01 crc kubenswrapper[4734]: I1206 00:01:01.975462 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29416321-nx5mh" podStartSLOduration=1.975435353 podStartE2EDuration="1.975435353s" podCreationTimestamp="2025-12-06 00:01:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 00:01:01.965404129 +0000 UTC m=+2482.648808425" watchObservedRunningTime="2025-12-06 00:01:01.975435353 +0000 UTC m=+2482.658839629" Dec 06 00:01:02 crc kubenswrapper[4734]: I1206 00:01:02.944926 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-purge-29416321-tqhwd" event={"ID":"3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124","Type":"ContainerStarted","Data":"d1fc9c8d8b74d69086b1b06cdc9e7079772bc1c719fbe13c54fe6e5c6023529e"} Dec 06 00:01:02 crc kubenswrapper[4734]: I1206 00:01:02.946482 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-purge-29416321-zr2lx" event={"ID":"e30aee89-812f-4e60-997e-54de845b7afe","Type":"ContainerStarted","Data":"ccfa58af1e45864e461897a571fc39d0c6960b4156c8b8f7e70de1fe69175581"} Dec 06 00:01:02 crc kubenswrapper[4734]: I1206 00:01:02.984408 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-purge-29416321-zr2lx" podStartSLOduration=2.9843800480000002 podStartE2EDuration="2.984380048s" podCreationTimestamp="2025-12-06 00:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 00:01:02.978953535 +0000 UTC m=+2483.662357811" watchObservedRunningTime="2025-12-06 00:01:02.984380048 +0000 UTC m=+2483.667784324" Dec 06 00:01:03 crc kubenswrapper[4734]: I1206 00:01:03.958834 4734 generic.go:334] "Generic (PLEG): container finished" podID="e30aee89-812f-4e60-997e-54de845b7afe" containerID="ccfa58af1e45864e461897a571fc39d0c6960b4156c8b8f7e70de1fe69175581" exitCode=0 Dec 06 00:01:03 crc kubenswrapper[4734]: I1206 00:01:03.958900 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-purge-29416321-zr2lx" 
event={"ID":"e30aee89-812f-4e60-997e-54de845b7afe","Type":"ContainerDied","Data":"ccfa58af1e45864e461897a571fc39d0c6960b4156c8b8f7e70de1fe69175581"} Dec 06 00:01:03 crc kubenswrapper[4734]: I1206 00:01:03.961137 4734 generic.go:334] "Generic (PLEG): container finished" podID="0e2a8f39-3819-46e4-9f5c-b2378637486f" containerID="acf49d7498f26ec48cee8b863016765bda3f00ab8c7e9cc5434b88a6fafd2c4a" exitCode=0 Dec 06 00:01:03 crc kubenswrapper[4734]: I1206 00:01:03.961232 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416321-nx5mh" event={"ID":"0e2a8f39-3819-46e4-9f5c-b2378637486f","Type":"ContainerDied","Data":"acf49d7498f26ec48cee8b863016765bda3f00ab8c7e9cc5434b88a6fafd2c4a"} Dec 06 00:01:04 crc kubenswrapper[4734]: I1206 00:01:04.976325 4734 generic.go:334] "Generic (PLEG): container finished" podID="3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124" containerID="d1fc9c8d8b74d69086b1b06cdc9e7079772bc1c719fbe13c54fe6e5c6023529e" exitCode=0 Dec 06 00:01:04 crc kubenswrapper[4734]: I1206 00:01:04.976396 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-purge-29416321-tqhwd" event={"ID":"3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124","Type":"ContainerDied","Data":"d1fc9c8d8b74d69086b1b06cdc9e7079772bc1c719fbe13c54fe6e5c6023529e"} Dec 06 00:01:05 crc kubenswrapper[4734]: I1206 00:01:05.425319 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29416321-nx5mh" Dec 06 00:01:05 crc kubenswrapper[4734]: I1206 00:01:05.433077 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-purge-29416321-zr2lx" Dec 06 00:01:05 crc kubenswrapper[4734]: I1206 00:01:05.520229 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e30aee89-812f-4e60-997e-54de845b7afe-combined-ca-bundle\") pod \"e30aee89-812f-4e60-997e-54de845b7afe\" (UID: \"e30aee89-812f-4e60-997e-54de845b7afe\") " Dec 06 00:01:05 crc kubenswrapper[4734]: I1206 00:01:05.520348 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0e2a8f39-3819-46e4-9f5c-b2378637486f-fernet-keys\") pod \"0e2a8f39-3819-46e4-9f5c-b2378637486f\" (UID: \"0e2a8f39-3819-46e4-9f5c-b2378637486f\") " Dec 06 00:01:05 crc kubenswrapper[4734]: I1206 00:01:05.520424 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e2a8f39-3819-46e4-9f5c-b2378637486f-combined-ca-bundle\") pod \"0e2a8f39-3819-46e4-9f5c-b2378637486f\" (UID: \"0e2a8f39-3819-46e4-9f5c-b2378637486f\") " Dec 06 00:01:05 crc kubenswrapper[4734]: I1206 00:01:05.520509 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64lpv\" (UniqueName: \"kubernetes.io/projected/0e2a8f39-3819-46e4-9f5c-b2378637486f-kube-api-access-64lpv\") pod \"0e2a8f39-3819-46e4-9f5c-b2378637486f\" (UID: \"0e2a8f39-3819-46e4-9f5c-b2378637486f\") " Dec 06 00:01:05 crc kubenswrapper[4734]: I1206 00:01:05.520595 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/e30aee89-812f-4e60-997e-54de845b7afe-db-purge-config-data\") pod \"e30aee89-812f-4e60-997e-54de845b7afe\" (UID: \"e30aee89-812f-4e60-997e-54de845b7afe\") " Dec 06 00:01:05 crc kubenswrapper[4734]: I1206 00:01:05.520618 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e2a8f39-3819-46e4-9f5c-b2378637486f-config-data\") pod \"0e2a8f39-3819-46e4-9f5c-b2378637486f\" (UID: \"0e2a8f39-3819-46e4-9f5c-b2378637486f\") " Dec 06 00:01:05 crc kubenswrapper[4734]: I1206 00:01:05.520669 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lhmd\" (UniqueName: \"kubernetes.io/projected/e30aee89-812f-4e60-997e-54de845b7afe-kube-api-access-2lhmd\") pod \"e30aee89-812f-4e60-997e-54de845b7afe\" (UID: \"e30aee89-812f-4e60-997e-54de845b7afe\") " Dec 06 00:01:05 crc kubenswrapper[4734]: I1206 00:01:05.520701 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e30aee89-812f-4e60-997e-54de845b7afe-config-data\") pod \"e30aee89-812f-4e60-997e-54de845b7afe\" (UID: \"e30aee89-812f-4e60-997e-54de845b7afe\") " Dec 06 00:01:05 crc kubenswrapper[4734]: I1206 00:01:05.527913 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e2a8f39-3819-46e4-9f5c-b2378637486f-kube-api-access-64lpv" (OuterVolumeSpecName: "kube-api-access-64lpv") pod "0e2a8f39-3819-46e4-9f5c-b2378637486f" (UID: "0e2a8f39-3819-46e4-9f5c-b2378637486f"). InnerVolumeSpecName "kube-api-access-64lpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:01:05 crc kubenswrapper[4734]: I1206 00:01:05.528672 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e30aee89-812f-4e60-997e-54de845b7afe-kube-api-access-2lhmd" (OuterVolumeSpecName: "kube-api-access-2lhmd") pod "e30aee89-812f-4e60-997e-54de845b7afe" (UID: "e30aee89-812f-4e60-997e-54de845b7afe"). InnerVolumeSpecName "kube-api-access-2lhmd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:01:05 crc kubenswrapper[4734]: I1206 00:01:05.529293 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e30aee89-812f-4e60-997e-54de845b7afe-db-purge-config-data" (OuterVolumeSpecName: "db-purge-config-data") pod "e30aee89-812f-4e60-997e-54de845b7afe" (UID: "e30aee89-812f-4e60-997e-54de845b7afe"). InnerVolumeSpecName "db-purge-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:01:05 crc kubenswrapper[4734]: I1206 00:01:05.529711 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e2a8f39-3819-46e4-9f5c-b2378637486f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0e2a8f39-3819-46e4-9f5c-b2378637486f" (UID: "0e2a8f39-3819-46e4-9f5c-b2378637486f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:01:05 crc kubenswrapper[4734]: I1206 00:01:05.555879 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e30aee89-812f-4e60-997e-54de845b7afe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e30aee89-812f-4e60-997e-54de845b7afe" (UID: "e30aee89-812f-4e60-997e-54de845b7afe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:01:05 crc kubenswrapper[4734]: I1206 00:01:05.556452 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e30aee89-812f-4e60-997e-54de845b7afe-config-data" (OuterVolumeSpecName: "config-data") pod "e30aee89-812f-4e60-997e-54de845b7afe" (UID: "e30aee89-812f-4e60-997e-54de845b7afe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:01:05 crc kubenswrapper[4734]: I1206 00:01:05.566820 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e2a8f39-3819-46e4-9f5c-b2378637486f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e2a8f39-3819-46e4-9f5c-b2378637486f" (UID: "0e2a8f39-3819-46e4-9f5c-b2378637486f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:01:05 crc kubenswrapper[4734]: I1206 00:01:05.593430 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e2a8f39-3819-46e4-9f5c-b2378637486f-config-data" (OuterVolumeSpecName: "config-data") pod "0e2a8f39-3819-46e4-9f5c-b2378637486f" (UID: "0e2a8f39-3819-46e4-9f5c-b2378637486f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:01:05 crc kubenswrapper[4734]: I1206 00:01:05.636623 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64lpv\" (UniqueName: \"kubernetes.io/projected/0e2a8f39-3819-46e4-9f5c-b2378637486f-kube-api-access-64lpv\") on node \"crc\" DevicePath \"\"" Dec 06 00:01:05 crc kubenswrapper[4734]: I1206 00:01:05.636678 4734 reconciler_common.go:293] "Volume detached for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/e30aee89-812f-4e60-997e-54de845b7afe-db-purge-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 00:01:05 crc kubenswrapper[4734]: I1206 00:01:05.636695 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e2a8f39-3819-46e4-9f5c-b2378637486f-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 00:01:05 crc kubenswrapper[4734]: I1206 00:01:05.636710 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lhmd\" (UniqueName: \"kubernetes.io/projected/e30aee89-812f-4e60-997e-54de845b7afe-kube-api-access-2lhmd\") on 
node \"crc\" DevicePath \"\"" Dec 06 00:01:05 crc kubenswrapper[4734]: I1206 00:01:05.636728 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e30aee89-812f-4e60-997e-54de845b7afe-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 00:01:05 crc kubenswrapper[4734]: I1206 00:01:05.636741 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e30aee89-812f-4e60-997e-54de845b7afe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 00:01:05 crc kubenswrapper[4734]: I1206 00:01:05.636755 4734 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0e2a8f39-3819-46e4-9f5c-b2378637486f-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 06 00:01:05 crc kubenswrapper[4734]: I1206 00:01:05.636772 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e2a8f39-3819-46e4-9f5c-b2378637486f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 00:01:06 crc kubenswrapper[4734]: I1206 00:01:06.000693 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-purge-29416321-zr2lx" event={"ID":"e30aee89-812f-4e60-997e-54de845b7afe","Type":"ContainerDied","Data":"9e63981c8875f9125b534eb02fa273acf980c8857b69e97db49f0972404f8162"} Dec 06 00:01:06 crc kubenswrapper[4734]: I1206 00:01:06.001193 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e63981c8875f9125b534eb02fa273acf980c8857b69e97db49f0972404f8162" Dec 06 00:01:06 crc kubenswrapper[4734]: I1206 00:01:06.000749 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-purge-29416321-zr2lx" Dec 06 00:01:06 crc kubenswrapper[4734]: I1206 00:01:06.008445 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29416321-nx5mh" Dec 06 00:01:06 crc kubenswrapper[4734]: I1206 00:01:06.008468 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29416321-nx5mh" event={"ID":"0e2a8f39-3819-46e4-9f5c-b2378637486f","Type":"ContainerDied","Data":"1330e3ebf841d52851bb9f5c26fbdbf97d1ad70946e44fe2e9639f9324bfa8e5"} Dec 06 00:01:06 crc kubenswrapper[4734]: I1206 00:01:06.008948 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1330e3ebf841d52851bb9f5c26fbdbf97d1ad70946e44fe2e9639f9324bfa8e5" Dec 06 00:01:06 crc kubenswrapper[4734]: I1206 00:01:06.381881 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-purge-29416321-tqhwd" Dec 06 00:01:06 crc kubenswrapper[4734]: I1206 00:01:06.458100 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124-db-purge-config-data\") pod \"3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124\" (UID: \"3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124\") " Dec 06 00:01:06 crc kubenswrapper[4734]: I1206 00:01:06.458549 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvl99\" (UniqueName: \"kubernetes.io/projected/3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124-kube-api-access-qvl99\") pod \"3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124\" (UID: \"3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124\") " Dec 06 00:01:06 crc kubenswrapper[4734]: I1206 00:01:06.458715 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124-config-data\") pod \"3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124\" (UID: \"3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124\") " Dec 06 00:01:06 crc kubenswrapper[4734]: I1206 00:01:06.459057 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124-combined-ca-bundle\") pod \"3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124\" (UID: \"3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124\") " Dec 06 00:01:06 crc kubenswrapper[4734]: I1206 00:01:06.471505 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124-kube-api-access-qvl99" (OuterVolumeSpecName: "kube-api-access-qvl99") pod "3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124" (UID: "3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124"). InnerVolumeSpecName "kube-api-access-qvl99". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:01:06 crc kubenswrapper[4734]: I1206 00:01:06.471702 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124-db-purge-config-data" (OuterVolumeSpecName: "db-purge-config-data") pod "3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124" (UID: "3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124"). InnerVolumeSpecName "db-purge-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:01:06 crc kubenswrapper[4734]: I1206 00:01:06.494278 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124" (UID: "3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:01:06 crc kubenswrapper[4734]: I1206 00:01:06.503051 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124-config-data" (OuterVolumeSpecName: "config-data") pod "3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124" (UID: "3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:01:06 crc kubenswrapper[4734]: I1206 00:01:06.562025 4734 reconciler_common.go:293] "Volume detached for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124-db-purge-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 00:01:06 crc kubenswrapper[4734]: I1206 00:01:06.562087 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvl99\" (UniqueName: \"kubernetes.io/projected/3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124-kube-api-access-qvl99\") on node \"crc\" DevicePath \"\"" Dec 06 00:01:06 crc kubenswrapper[4734]: I1206 00:01:06.562108 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 00:01:06 crc kubenswrapper[4734]: I1206 00:01:06.562124 4734 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 00:01:07 crc kubenswrapper[4734]: I1206 00:01:07.022574 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-purge-29416321-tqhwd" event={"ID":"3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124","Type":"ContainerDied","Data":"f47899d9addc753a0a8ed1cd2f3ac116fa9d6ff03fd1396408fcdfee31a53564"} Dec 06 00:01:07 crc kubenswrapper[4734]: I1206 00:01:07.022632 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f47899d9addc753a0a8ed1cd2f3ac116fa9d6ff03fd1396408fcdfee31a53564" Dec 06 00:01:07 crc kubenswrapper[4734]: I1206 00:01:07.025478 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-purge-29416321-tqhwd" Dec 06 00:01:13 crc kubenswrapper[4734]: I1206 00:01:13.614325 4734 scope.go:117] "RemoveContainer" containerID="c1125a47316243dbc8e4b9f56d99d1db26d491a48005ab2e218e005031c75762" Dec 06 00:01:13 crc kubenswrapper[4734]: E1206 00:01:13.615149 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:01:27 crc kubenswrapper[4734]: I1206 00:01:27.615208 4734 scope.go:117] "RemoveContainer" containerID="c1125a47316243dbc8e4b9f56d99d1db26d491a48005ab2e218e005031c75762" Dec 06 00:01:27 crc kubenswrapper[4734]: E1206 00:01:27.616511 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:01:42 crc kubenswrapper[4734]: I1206 00:01:42.615144 4734 scope.go:117] "RemoveContainer" containerID="c1125a47316243dbc8e4b9f56d99d1db26d491a48005ab2e218e005031c75762" Dec 06 00:01:42 crc kubenswrapper[4734]: E1206 00:01:42.616307 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:01:57 crc kubenswrapper[4734]: I1206 00:01:57.613932 4734 scope.go:117] "RemoveContainer" containerID="c1125a47316243dbc8e4b9f56d99d1db26d491a48005ab2e218e005031c75762" Dec 06 00:01:57 crc kubenswrapper[4734]: E1206 00:01:57.615639 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:02:12 crc kubenswrapper[4734]: I1206 00:02:12.615773 4734 scope.go:117] "RemoveContainer" containerID="c1125a47316243dbc8e4b9f56d99d1db26d491a48005ab2e218e005031c75762" Dec 06 00:02:12 crc kubenswrapper[4734]: E1206 00:02:12.616765 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:02:25 crc kubenswrapper[4734]: I1206 00:02:25.614787 4734 scope.go:117] "RemoveContainer" containerID="c1125a47316243dbc8e4b9f56d99d1db26d491a48005ab2e218e005031c75762" Dec 06 00:02:25 crc kubenswrapper[4734]: E1206 00:02:25.617109 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:02:39 crc kubenswrapper[4734]: I1206 00:02:39.622407 4734 scope.go:117] "RemoveContainer" containerID="c1125a47316243dbc8e4b9f56d99d1db26d491a48005ab2e218e005031c75762" Dec 06 00:02:39 crc kubenswrapper[4734]: E1206 00:02:39.623492 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:02:52 crc kubenswrapper[4734]: I1206 00:02:52.615827 4734 scope.go:117] "RemoveContainer" containerID="c1125a47316243dbc8e4b9f56d99d1db26d491a48005ab2e218e005031c75762" Dec 06 00:02:53 crc kubenswrapper[4734]: I1206 00:02:53.147635 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerStarted","Data":"4163ec915cc478d242c1011fc445081e62cb49bf78f698e993005b968d9348bf"} Dec 06 00:04:38 crc kubenswrapper[4734]: I1206 00:04:38.391505 4734 generic.go:334] "Generic (PLEG): container finished" podID="7d966291-cd7e-47ce-a95e-bee879371108" containerID="7a4c85c02cc2ef915f5df5361ef242f94ad4de6480e7f2b5eb7f86094d586dec" exitCode=0 Dec 06 00:04:38 crc kubenswrapper[4734]: I1206 00:04:38.392494 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" event={"ID":"7d966291-cd7e-47ce-a95e-bee879371108","Type":"ContainerDied","Data":"7a4c85c02cc2ef915f5df5361ef242f94ad4de6480e7f2b5eb7f86094d586dec"} Dec 
06 00:04:39 crc kubenswrapper[4734]: I1206 00:04:39.944194 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.145660 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7d966291-cd7e-47ce-a95e-bee879371108-nova-extra-config-0\") pod \"7d966291-cd7e-47ce-a95e-bee879371108\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.145781 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-nova-migration-ssh-key-1\") pod \"7d966291-cd7e-47ce-a95e-bee879371108\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.145857 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-inventory\") pod \"7d966291-cd7e-47ce-a95e-bee879371108\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.145892 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-ssh-key\") pod \"7d966291-cd7e-47ce-a95e-bee879371108\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.145908 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-nova-migration-ssh-key-0\") pod \"7d966291-cd7e-47ce-a95e-bee879371108\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " Dec 06 
00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.146068 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-nova-combined-ca-bundle\") pod \"7d966291-cd7e-47ce-a95e-bee879371108\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.146122 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-nova-cell1-compute-config-1\") pod \"7d966291-cd7e-47ce-a95e-bee879371108\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.146162 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-nova-cell1-compute-config-0\") pod \"7d966291-cd7e-47ce-a95e-bee879371108\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.146829 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72sz5\" (UniqueName: \"kubernetes.io/projected/7d966291-cd7e-47ce-a95e-bee879371108-kube-api-access-72sz5\") pod \"7d966291-cd7e-47ce-a95e-bee879371108\" (UID: \"7d966291-cd7e-47ce-a95e-bee879371108\") " Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.154387 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "7d966291-cd7e-47ce-a95e-bee879371108" (UID: "7d966291-cd7e-47ce-a95e-bee879371108"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.154583 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d966291-cd7e-47ce-a95e-bee879371108-kube-api-access-72sz5" (OuterVolumeSpecName: "kube-api-access-72sz5") pod "7d966291-cd7e-47ce-a95e-bee879371108" (UID: "7d966291-cd7e-47ce-a95e-bee879371108"). InnerVolumeSpecName "kube-api-access-72sz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.186607 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "7d966291-cd7e-47ce-a95e-bee879371108" (UID: "7d966291-cd7e-47ce-a95e-bee879371108"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.192084 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7d966291-cd7e-47ce-a95e-bee879371108" (UID: "7d966291-cd7e-47ce-a95e-bee879371108"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.197197 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-inventory" (OuterVolumeSpecName: "inventory") pod "7d966291-cd7e-47ce-a95e-bee879371108" (UID: "7d966291-cd7e-47ce-a95e-bee879371108"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.201192 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d966291-cd7e-47ce-a95e-bee879371108-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "7d966291-cd7e-47ce-a95e-bee879371108" (UID: "7d966291-cd7e-47ce-a95e-bee879371108"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.202178 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "7d966291-cd7e-47ce-a95e-bee879371108" (UID: "7d966291-cd7e-47ce-a95e-bee879371108"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.210130 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "7d966291-cd7e-47ce-a95e-bee879371108" (UID: "7d966291-cd7e-47ce-a95e-bee879371108"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.210728 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "7d966291-cd7e-47ce-a95e-bee879371108" (UID: "7d966291-cd7e-47ce-a95e-bee879371108"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.249247 4734 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.249324 4734 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.249368 4734 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.249378 4734 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.249387 4734 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.249396 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72sz5\" (UniqueName: \"kubernetes.io/projected/7d966291-cd7e-47ce-a95e-bee879371108-kube-api-access-72sz5\") on node \"crc\" DevicePath \"\"" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.249409 4734 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7d966291-cd7e-47ce-a95e-bee879371108-nova-extra-config-0\") on node \"crc\" 
DevicePath \"\"" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.249417 4734 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.249425 4734 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d966291-cd7e-47ce-a95e-bee879371108-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.434468 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" event={"ID":"7d966291-cd7e-47ce-a95e-bee879371108","Type":"ContainerDied","Data":"3fd0c392b5797ef34d1d7f89726a75b1206c3b3d3b559298ffb70c06abe46d10"} Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.434547 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fd0c392b5797ef34d1d7f89726a75b1206c3b3d3b559298ffb70c06abe46d10" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.434660 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l7kts" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.591113 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb"] Dec 06 00:04:40 crc kubenswrapper[4734]: E1206 00:04:40.592163 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e2a8f39-3819-46e4-9f5c-b2378637486f" containerName="keystone-cron" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.592186 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e2a8f39-3819-46e4-9f5c-b2378637486f" containerName="keystone-cron" Dec 06 00:04:40 crc kubenswrapper[4734]: E1206 00:04:40.592215 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124" containerName="cinder-db-purge" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.592222 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124" containerName="cinder-db-purge" Dec 06 00:04:40 crc kubenswrapper[4734]: E1206 00:04:40.592234 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e30aee89-812f-4e60-997e-54de845b7afe" containerName="glance-dbpurge" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.592242 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="e30aee89-812f-4e60-997e-54de845b7afe" containerName="glance-dbpurge" Dec 06 00:04:40 crc kubenswrapper[4734]: E1206 00:04:40.592260 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d966291-cd7e-47ce-a95e-bee879371108" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.592267 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d966291-cd7e-47ce-a95e-bee879371108" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.592449 4734 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="e30aee89-812f-4e60-997e-54de845b7afe" containerName="glance-dbpurge" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.592473 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e2a8f39-3819-46e4-9f5c-b2378637486f" containerName="keystone-cron" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.592495 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124" containerName="cinder-db-purge" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.592506 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d966291-cd7e-47ce-a95e-bee879371108" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.593349 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.598001 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.598038 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gsdqx" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.598217 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.598267 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.598367 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.612116 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb"] Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.760188 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb\" (UID: \"039811b0-a938-445d-b5a4-702b526f8356\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.760338 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb\" (UID: \"039811b0-a938-445d-b5a4-702b526f8356\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.760418 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb\" (UID: \"039811b0-a938-445d-b5a4-702b526f8356\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.760574 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb\" (UID: \"039811b0-a938-445d-b5a4-702b526f8356\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.760619 4734 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wt5x\" (UniqueName: \"kubernetes.io/projected/039811b0-a938-445d-b5a4-702b526f8356-kube-api-access-9wt5x\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb\" (UID: \"039811b0-a938-445d-b5a4-702b526f8356\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.760643 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb\" (UID: \"039811b0-a938-445d-b5a4-702b526f8356\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.760692 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb\" (UID: \"039811b0-a938-445d-b5a4-702b526f8356\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.863031 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb\" (UID: \"039811b0-a938-445d-b5a4-702b526f8356\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.863107 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb\" (UID: \"039811b0-a938-445d-b5a4-702b526f8356\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.863157 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb\" (UID: \"039811b0-a938-445d-b5a4-702b526f8356\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.863213 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb\" (UID: \"039811b0-a938-445d-b5a4-702b526f8356\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.863246 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wt5x\" (UniqueName: \"kubernetes.io/projected/039811b0-a938-445d-b5a4-702b526f8356-kube-api-access-9wt5x\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb\" (UID: \"039811b0-a938-445d-b5a4-702b526f8356\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.863276 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb\" (UID: \"039811b0-a938-445d-b5a4-702b526f8356\") 
" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.863331 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb\" (UID: \"039811b0-a938-445d-b5a4-702b526f8356\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.869420 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb\" (UID: \"039811b0-a938-445d-b5a4-702b526f8356\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.869458 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb\" (UID: \"039811b0-a938-445d-b5a4-702b526f8356\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.870006 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb\" (UID: \"039811b0-a938-445d-b5a4-702b526f8356\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.871235 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb\" (UID: \"039811b0-a938-445d-b5a4-702b526f8356\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.873014 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb\" (UID: \"039811b0-a938-445d-b5a4-702b526f8356\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.873989 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb\" (UID: \"039811b0-a938-445d-b5a4-702b526f8356\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.886480 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wt5x\" (UniqueName: \"kubernetes.io/projected/039811b0-a938-445d-b5a4-702b526f8356-kube-api-access-9wt5x\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb\" (UID: \"039811b0-a938-445d-b5a4-702b526f8356\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb" Dec 06 00:04:40 crc kubenswrapper[4734]: I1206 00:04:40.915772 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb" Dec 06 00:04:41 crc kubenswrapper[4734]: I1206 00:04:41.635465 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb"] Dec 06 00:04:41 crc kubenswrapper[4734]: I1206 00:04:41.654765 4734 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 00:04:42 crc kubenswrapper[4734]: I1206 00:04:42.462340 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb" event={"ID":"039811b0-a938-445d-b5a4-702b526f8356","Type":"ContainerStarted","Data":"beca813844c257715680290ba8ebf794fa607f38b784740e93669e632d8255fb"} Dec 06 00:04:42 crc kubenswrapper[4734]: I1206 00:04:42.463071 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb" event={"ID":"039811b0-a938-445d-b5a4-702b526f8356","Type":"ContainerStarted","Data":"d48a5e73af1a73a8ade5764006281871484b6b75ad4752bf32a501c7e01a5b41"} Dec 06 00:04:42 crc kubenswrapper[4734]: I1206 00:04:42.483044 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb" podStartSLOduration=1.942940139 podStartE2EDuration="2.483017103s" podCreationTimestamp="2025-12-06 00:04:40 +0000 UTC" firstStartedPulling="2025-12-06 00:04:41.654305704 +0000 UTC m=+2702.337709990" lastFinishedPulling="2025-12-06 00:04:42.194382678 +0000 UTC m=+2702.877786954" observedRunningTime="2025-12-06 00:04:42.481186168 +0000 UTC m=+2703.164590444" watchObservedRunningTime="2025-12-06 00:04:42.483017103 +0000 UTC m=+2703.166421379" Dec 06 00:04:55 crc kubenswrapper[4734]: I1206 00:04:55.323446 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p5hlw"] Dec 06 00:04:55 crc kubenswrapper[4734]: I1206 00:04:55.326445 
4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p5hlw" Dec 06 00:04:55 crc kubenswrapper[4734]: I1206 00:04:55.340129 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p5hlw"] Dec 06 00:04:55 crc kubenswrapper[4734]: I1206 00:04:55.381215 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b49b196-a168-46e0-8267-80a4628cb4cd-catalog-content\") pod \"redhat-operators-p5hlw\" (UID: \"2b49b196-a168-46e0-8267-80a4628cb4cd\") " pod="openshift-marketplace/redhat-operators-p5hlw" Dec 06 00:04:55 crc kubenswrapper[4734]: I1206 00:04:55.381404 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b49b196-a168-46e0-8267-80a4628cb4cd-utilities\") pod \"redhat-operators-p5hlw\" (UID: \"2b49b196-a168-46e0-8267-80a4628cb4cd\") " pod="openshift-marketplace/redhat-operators-p5hlw" Dec 06 00:04:55 crc kubenswrapper[4734]: I1206 00:04:55.381457 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwlrg\" (UniqueName: \"kubernetes.io/projected/2b49b196-a168-46e0-8267-80a4628cb4cd-kube-api-access-jwlrg\") pod \"redhat-operators-p5hlw\" (UID: \"2b49b196-a168-46e0-8267-80a4628cb4cd\") " pod="openshift-marketplace/redhat-operators-p5hlw" Dec 06 00:04:55 crc kubenswrapper[4734]: I1206 00:04:55.484312 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b49b196-a168-46e0-8267-80a4628cb4cd-catalog-content\") pod \"redhat-operators-p5hlw\" (UID: \"2b49b196-a168-46e0-8267-80a4628cb4cd\") " pod="openshift-marketplace/redhat-operators-p5hlw" Dec 06 00:04:55 crc kubenswrapper[4734]: I1206 00:04:55.484765 4734 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b49b196-a168-46e0-8267-80a4628cb4cd-utilities\") pod \"redhat-operators-p5hlw\" (UID: \"2b49b196-a168-46e0-8267-80a4628cb4cd\") " pod="openshift-marketplace/redhat-operators-p5hlw" Dec 06 00:04:55 crc kubenswrapper[4734]: I1206 00:04:55.484888 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b49b196-a168-46e0-8267-80a4628cb4cd-catalog-content\") pod \"redhat-operators-p5hlw\" (UID: \"2b49b196-a168-46e0-8267-80a4628cb4cd\") " pod="openshift-marketplace/redhat-operators-p5hlw" Dec 06 00:04:55 crc kubenswrapper[4734]: I1206 00:04:55.484897 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwlrg\" (UniqueName: \"kubernetes.io/projected/2b49b196-a168-46e0-8267-80a4628cb4cd-kube-api-access-jwlrg\") pod \"redhat-operators-p5hlw\" (UID: \"2b49b196-a168-46e0-8267-80a4628cb4cd\") " pod="openshift-marketplace/redhat-operators-p5hlw" Dec 06 00:04:55 crc kubenswrapper[4734]: I1206 00:04:55.485400 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b49b196-a168-46e0-8267-80a4628cb4cd-utilities\") pod \"redhat-operators-p5hlw\" (UID: \"2b49b196-a168-46e0-8267-80a4628cb4cd\") " pod="openshift-marketplace/redhat-operators-p5hlw" Dec 06 00:04:55 crc kubenswrapper[4734]: I1206 00:04:55.509241 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwlrg\" (UniqueName: \"kubernetes.io/projected/2b49b196-a168-46e0-8267-80a4628cb4cd-kube-api-access-jwlrg\") pod \"redhat-operators-p5hlw\" (UID: \"2b49b196-a168-46e0-8267-80a4628cb4cd\") " pod="openshift-marketplace/redhat-operators-p5hlw" Dec 06 00:04:55 crc kubenswrapper[4734]: I1206 00:04:55.650473 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p5hlw" Dec 06 00:04:56 crc kubenswrapper[4734]: I1206 00:04:56.276648 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p5hlw"] Dec 06 00:04:56 crc kubenswrapper[4734]: I1206 00:04:56.612229 4734 generic.go:334] "Generic (PLEG): container finished" podID="2b49b196-a168-46e0-8267-80a4628cb4cd" containerID="c24b6d887b554e89301a6483ae457584b6d65abf19cf417f5efe2f6cd17bb8c0" exitCode=0 Dec 06 00:04:56 crc kubenswrapper[4734]: I1206 00:04:56.612302 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p5hlw" event={"ID":"2b49b196-a168-46e0-8267-80a4628cb4cd","Type":"ContainerDied","Data":"c24b6d887b554e89301a6483ae457584b6d65abf19cf417f5efe2f6cd17bb8c0"} Dec 06 00:04:56 crc kubenswrapper[4734]: I1206 00:04:56.612345 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p5hlw" event={"ID":"2b49b196-a168-46e0-8267-80a4628cb4cd","Type":"ContainerStarted","Data":"996cef42633ab61393e47585f87b89eec26e43a02a9f17c785fa273e40bb7b58"} Dec 06 00:04:57 crc kubenswrapper[4734]: I1206 00:04:57.626039 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p5hlw" event={"ID":"2b49b196-a168-46e0-8267-80a4628cb4cd","Type":"ContainerStarted","Data":"e07922991962784b7e34f3b3448bb82525c7598b56bf5d985957e5085d5472c6"} Dec 06 00:04:58 crc kubenswrapper[4734]: I1206 00:04:58.639807 4734 generic.go:334] "Generic (PLEG): container finished" podID="2b49b196-a168-46e0-8267-80a4628cb4cd" containerID="e07922991962784b7e34f3b3448bb82525c7598b56bf5d985957e5085d5472c6" exitCode=0 Dec 06 00:04:58 crc kubenswrapper[4734]: I1206 00:04:58.639924 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p5hlw" 
event={"ID":"2b49b196-a168-46e0-8267-80a4628cb4cd","Type":"ContainerDied","Data":"e07922991962784b7e34f3b3448bb82525c7598b56bf5d985957e5085d5472c6"} Dec 06 00:04:59 crc kubenswrapper[4734]: I1206 00:04:59.655129 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p5hlw" event={"ID":"2b49b196-a168-46e0-8267-80a4628cb4cd","Type":"ContainerStarted","Data":"f479e61cc318d32b4163b96e904d08d06d5113db823787e0822bf6beafc70af0"} Dec 06 00:05:05 crc kubenswrapper[4734]: I1206 00:05:05.651645 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p5hlw" Dec 06 00:05:05 crc kubenswrapper[4734]: I1206 00:05:05.654220 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p5hlw" Dec 06 00:05:05 crc kubenswrapper[4734]: I1206 00:05:05.716638 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p5hlw" Dec 06 00:05:05 crc kubenswrapper[4734]: I1206 00:05:05.753474 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p5hlw" podStartSLOduration=8.280482418 podStartE2EDuration="10.753447441s" podCreationTimestamp="2025-12-06 00:04:55 +0000 UTC" firstStartedPulling="2025-12-06 00:04:56.617608744 +0000 UTC m=+2717.301013030" lastFinishedPulling="2025-12-06 00:04:59.090573777 +0000 UTC m=+2719.773978053" observedRunningTime="2025-12-06 00:04:59.677422921 +0000 UTC m=+2720.360827197" watchObservedRunningTime="2025-12-06 00:05:05.753447441 +0000 UTC m=+2726.436851717" Dec 06 00:05:05 crc kubenswrapper[4734]: I1206 00:05:05.787320 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p5hlw" Dec 06 00:05:05 crc kubenswrapper[4734]: I1206 00:05:05.970752 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-p5hlw"] Dec 06 00:05:07 crc kubenswrapper[4734]: I1206 00:05:07.749662 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p5hlw" podUID="2b49b196-a168-46e0-8267-80a4628cb4cd" containerName="registry-server" containerID="cri-o://f479e61cc318d32b4163b96e904d08d06d5113db823787e0822bf6beafc70af0" gracePeriod=2 Dec 06 00:05:08 crc kubenswrapper[4734]: I1206 00:05:08.257212 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p5hlw" Dec 06 00:05:08 crc kubenswrapper[4734]: I1206 00:05:08.444150 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b49b196-a168-46e0-8267-80a4628cb4cd-utilities\") pod \"2b49b196-a168-46e0-8267-80a4628cb4cd\" (UID: \"2b49b196-a168-46e0-8267-80a4628cb4cd\") " Dec 06 00:05:08 crc kubenswrapper[4734]: I1206 00:05:08.444264 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwlrg\" (UniqueName: \"kubernetes.io/projected/2b49b196-a168-46e0-8267-80a4628cb4cd-kube-api-access-jwlrg\") pod \"2b49b196-a168-46e0-8267-80a4628cb4cd\" (UID: \"2b49b196-a168-46e0-8267-80a4628cb4cd\") " Dec 06 00:05:08 crc kubenswrapper[4734]: I1206 00:05:08.444485 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b49b196-a168-46e0-8267-80a4628cb4cd-catalog-content\") pod \"2b49b196-a168-46e0-8267-80a4628cb4cd\" (UID: \"2b49b196-a168-46e0-8267-80a4628cb4cd\") " Dec 06 00:05:08 crc kubenswrapper[4734]: I1206 00:05:08.445305 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b49b196-a168-46e0-8267-80a4628cb4cd-utilities" (OuterVolumeSpecName: "utilities") pod "2b49b196-a168-46e0-8267-80a4628cb4cd" (UID: 
"2b49b196-a168-46e0-8267-80a4628cb4cd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 00:05:08 crc kubenswrapper[4734]: I1206 00:05:08.459348 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b49b196-a168-46e0-8267-80a4628cb4cd-kube-api-access-jwlrg" (OuterVolumeSpecName: "kube-api-access-jwlrg") pod "2b49b196-a168-46e0-8267-80a4628cb4cd" (UID: "2b49b196-a168-46e0-8267-80a4628cb4cd"). InnerVolumeSpecName "kube-api-access-jwlrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:05:08 crc kubenswrapper[4734]: I1206 00:05:08.546619 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b49b196-a168-46e0-8267-80a4628cb4cd-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 00:05:08 crc kubenswrapper[4734]: I1206 00:05:08.546668 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwlrg\" (UniqueName: \"kubernetes.io/projected/2b49b196-a168-46e0-8267-80a4628cb4cd-kube-api-access-jwlrg\") on node \"crc\" DevicePath \"\"" Dec 06 00:05:08 crc kubenswrapper[4734]: I1206 00:05:08.552918 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b49b196-a168-46e0-8267-80a4628cb4cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b49b196-a168-46e0-8267-80a4628cb4cd" (UID: "2b49b196-a168-46e0-8267-80a4628cb4cd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 00:05:08 crc kubenswrapper[4734]: I1206 00:05:08.648439 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b49b196-a168-46e0-8267-80a4628cb4cd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 00:05:08 crc kubenswrapper[4734]: I1206 00:05:08.763253 4734 generic.go:334] "Generic (PLEG): container finished" podID="2b49b196-a168-46e0-8267-80a4628cb4cd" containerID="f479e61cc318d32b4163b96e904d08d06d5113db823787e0822bf6beafc70af0" exitCode=0 Dec 06 00:05:08 crc kubenswrapper[4734]: I1206 00:05:08.763354 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p5hlw" event={"ID":"2b49b196-a168-46e0-8267-80a4628cb4cd","Type":"ContainerDied","Data":"f479e61cc318d32b4163b96e904d08d06d5113db823787e0822bf6beafc70af0"} Dec 06 00:05:08 crc kubenswrapper[4734]: I1206 00:05:08.763391 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p5hlw" Dec 06 00:05:08 crc kubenswrapper[4734]: I1206 00:05:08.763888 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p5hlw" event={"ID":"2b49b196-a168-46e0-8267-80a4628cb4cd","Type":"ContainerDied","Data":"996cef42633ab61393e47585f87b89eec26e43a02a9f17c785fa273e40bb7b58"} Dec 06 00:05:08 crc kubenswrapper[4734]: I1206 00:05:08.763925 4734 scope.go:117] "RemoveContainer" containerID="f479e61cc318d32b4163b96e904d08d06d5113db823787e0822bf6beafc70af0" Dec 06 00:05:08 crc kubenswrapper[4734]: I1206 00:05:08.792734 4734 scope.go:117] "RemoveContainer" containerID="e07922991962784b7e34f3b3448bb82525c7598b56bf5d985957e5085d5472c6" Dec 06 00:05:08 crc kubenswrapper[4734]: I1206 00:05:08.821626 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p5hlw"] Dec 06 00:05:08 crc kubenswrapper[4734]: I1206 00:05:08.834471 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p5hlw"] Dec 06 00:05:08 crc kubenswrapper[4734]: I1206 00:05:08.842983 4734 scope.go:117] "RemoveContainer" containerID="c24b6d887b554e89301a6483ae457584b6d65abf19cf417f5efe2f6cd17bb8c0" Dec 06 00:05:08 crc kubenswrapper[4734]: I1206 00:05:08.887677 4734 scope.go:117] "RemoveContainer" containerID="f479e61cc318d32b4163b96e904d08d06d5113db823787e0822bf6beafc70af0" Dec 06 00:05:08 crc kubenswrapper[4734]: E1206 00:05:08.888348 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f479e61cc318d32b4163b96e904d08d06d5113db823787e0822bf6beafc70af0\": container with ID starting with f479e61cc318d32b4163b96e904d08d06d5113db823787e0822bf6beafc70af0 not found: ID does not exist" containerID="f479e61cc318d32b4163b96e904d08d06d5113db823787e0822bf6beafc70af0" Dec 06 00:05:08 crc kubenswrapper[4734]: I1206 00:05:08.888440 4734 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f479e61cc318d32b4163b96e904d08d06d5113db823787e0822bf6beafc70af0"} err="failed to get container status \"f479e61cc318d32b4163b96e904d08d06d5113db823787e0822bf6beafc70af0\": rpc error: code = NotFound desc = could not find container \"f479e61cc318d32b4163b96e904d08d06d5113db823787e0822bf6beafc70af0\": container with ID starting with f479e61cc318d32b4163b96e904d08d06d5113db823787e0822bf6beafc70af0 not found: ID does not exist" Dec 06 00:05:08 crc kubenswrapper[4734]: I1206 00:05:08.888496 4734 scope.go:117] "RemoveContainer" containerID="e07922991962784b7e34f3b3448bb82525c7598b56bf5d985957e5085d5472c6" Dec 06 00:05:08 crc kubenswrapper[4734]: E1206 00:05:08.889031 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e07922991962784b7e34f3b3448bb82525c7598b56bf5d985957e5085d5472c6\": container with ID starting with e07922991962784b7e34f3b3448bb82525c7598b56bf5d985957e5085d5472c6 not found: ID does not exist" containerID="e07922991962784b7e34f3b3448bb82525c7598b56bf5d985957e5085d5472c6" Dec 06 00:05:08 crc kubenswrapper[4734]: I1206 00:05:08.889082 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e07922991962784b7e34f3b3448bb82525c7598b56bf5d985957e5085d5472c6"} err="failed to get container status \"e07922991962784b7e34f3b3448bb82525c7598b56bf5d985957e5085d5472c6\": rpc error: code = NotFound desc = could not find container \"e07922991962784b7e34f3b3448bb82525c7598b56bf5d985957e5085d5472c6\": container with ID starting with e07922991962784b7e34f3b3448bb82525c7598b56bf5d985957e5085d5472c6 not found: ID does not exist" Dec 06 00:05:08 crc kubenswrapper[4734]: I1206 00:05:08.889119 4734 scope.go:117] "RemoveContainer" containerID="c24b6d887b554e89301a6483ae457584b6d65abf19cf417f5efe2f6cd17bb8c0" Dec 06 00:05:08 crc kubenswrapper[4734]: E1206 
00:05:08.889476 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c24b6d887b554e89301a6483ae457584b6d65abf19cf417f5efe2f6cd17bb8c0\": container with ID starting with c24b6d887b554e89301a6483ae457584b6d65abf19cf417f5efe2f6cd17bb8c0 not found: ID does not exist" containerID="c24b6d887b554e89301a6483ae457584b6d65abf19cf417f5efe2f6cd17bb8c0" Dec 06 00:05:08 crc kubenswrapper[4734]: I1206 00:05:08.889580 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c24b6d887b554e89301a6483ae457584b6d65abf19cf417f5efe2f6cd17bb8c0"} err="failed to get container status \"c24b6d887b554e89301a6483ae457584b6d65abf19cf417f5efe2f6cd17bb8c0\": rpc error: code = NotFound desc = could not find container \"c24b6d887b554e89301a6483ae457584b6d65abf19cf417f5efe2f6cd17bb8c0\": container with ID starting with c24b6d887b554e89301a6483ae457584b6d65abf19cf417f5efe2f6cd17bb8c0 not found: ID does not exist" Dec 06 00:05:09 crc kubenswrapper[4734]: I1206 00:05:09.629262 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b49b196-a168-46e0-8267-80a4628cb4cd" path="/var/lib/kubelet/pods/2b49b196-a168-46e0-8267-80a4628cb4cd/volumes" Dec 06 00:05:20 crc kubenswrapper[4734]: I1206 00:05:20.444581 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 00:05:20 crc kubenswrapper[4734]: I1206 00:05:20.445574 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 06 00:05:50 crc kubenswrapper[4734]: I1206 00:05:50.445193 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 00:05:50 crc kubenswrapper[4734]: I1206 00:05:50.446140 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 00:06:10 crc kubenswrapper[4734]: I1206 00:06:10.710297 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fnjwq"] Dec 06 00:06:10 crc kubenswrapper[4734]: E1206 00:06:10.711567 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b49b196-a168-46e0-8267-80a4628cb4cd" containerName="extract-content" Dec 06 00:06:10 crc kubenswrapper[4734]: I1206 00:06:10.711587 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b49b196-a168-46e0-8267-80a4628cb4cd" containerName="extract-content" Dec 06 00:06:10 crc kubenswrapper[4734]: E1206 00:06:10.711637 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b49b196-a168-46e0-8267-80a4628cb4cd" containerName="extract-utilities" Dec 06 00:06:10 crc kubenswrapper[4734]: I1206 00:06:10.711648 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b49b196-a168-46e0-8267-80a4628cb4cd" containerName="extract-utilities" Dec 06 00:06:10 crc kubenswrapper[4734]: E1206 00:06:10.711666 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b49b196-a168-46e0-8267-80a4628cb4cd" containerName="registry-server" Dec 06 00:06:10 crc kubenswrapper[4734]: I1206 00:06:10.711674 4734 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2b49b196-a168-46e0-8267-80a4628cb4cd" containerName="registry-server" Dec 06 00:06:10 crc kubenswrapper[4734]: I1206 00:06:10.711931 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b49b196-a168-46e0-8267-80a4628cb4cd" containerName="registry-server" Dec 06 00:06:10 crc kubenswrapper[4734]: I1206 00:06:10.714032 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fnjwq" Dec 06 00:06:10 crc kubenswrapper[4734]: I1206 00:06:10.723649 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fnjwq"] Dec 06 00:06:10 crc kubenswrapper[4734]: I1206 00:06:10.855218 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89aa4ce4-ff20-4e06-b447-b1bce857e5d2-utilities\") pod \"redhat-marketplace-fnjwq\" (UID: \"89aa4ce4-ff20-4e06-b447-b1bce857e5d2\") " pod="openshift-marketplace/redhat-marketplace-fnjwq" Dec 06 00:06:10 crc kubenswrapper[4734]: I1206 00:06:10.855404 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89aa4ce4-ff20-4e06-b447-b1bce857e5d2-catalog-content\") pod \"redhat-marketplace-fnjwq\" (UID: \"89aa4ce4-ff20-4e06-b447-b1bce857e5d2\") " pod="openshift-marketplace/redhat-marketplace-fnjwq" Dec 06 00:06:10 crc kubenswrapper[4734]: I1206 00:06:10.855520 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jvs4\" (UniqueName: \"kubernetes.io/projected/89aa4ce4-ff20-4e06-b447-b1bce857e5d2-kube-api-access-5jvs4\") pod \"redhat-marketplace-fnjwq\" (UID: \"89aa4ce4-ff20-4e06-b447-b1bce857e5d2\") " pod="openshift-marketplace/redhat-marketplace-fnjwq" Dec 06 00:06:10 crc kubenswrapper[4734]: I1206 00:06:10.958319 
4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89aa4ce4-ff20-4e06-b447-b1bce857e5d2-utilities\") pod \"redhat-marketplace-fnjwq\" (UID: \"89aa4ce4-ff20-4e06-b447-b1bce857e5d2\") " pod="openshift-marketplace/redhat-marketplace-fnjwq" Dec 06 00:06:10 crc kubenswrapper[4734]: I1206 00:06:10.959010 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89aa4ce4-ff20-4e06-b447-b1bce857e5d2-utilities\") pod \"redhat-marketplace-fnjwq\" (UID: \"89aa4ce4-ff20-4e06-b447-b1bce857e5d2\") " pod="openshift-marketplace/redhat-marketplace-fnjwq" Dec 06 00:06:10 crc kubenswrapper[4734]: I1206 00:06:10.959073 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89aa4ce4-ff20-4e06-b447-b1bce857e5d2-catalog-content\") pod \"redhat-marketplace-fnjwq\" (UID: \"89aa4ce4-ff20-4e06-b447-b1bce857e5d2\") " pod="openshift-marketplace/redhat-marketplace-fnjwq" Dec 06 00:06:10 crc kubenswrapper[4734]: I1206 00:06:10.959255 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jvs4\" (UniqueName: \"kubernetes.io/projected/89aa4ce4-ff20-4e06-b447-b1bce857e5d2-kube-api-access-5jvs4\") pod \"redhat-marketplace-fnjwq\" (UID: \"89aa4ce4-ff20-4e06-b447-b1bce857e5d2\") " pod="openshift-marketplace/redhat-marketplace-fnjwq" Dec 06 00:06:10 crc kubenswrapper[4734]: I1206 00:06:10.963227 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89aa4ce4-ff20-4e06-b447-b1bce857e5d2-catalog-content\") pod \"redhat-marketplace-fnjwq\" (UID: \"89aa4ce4-ff20-4e06-b447-b1bce857e5d2\") " pod="openshift-marketplace/redhat-marketplace-fnjwq" Dec 06 00:06:10 crc kubenswrapper[4734]: I1206 00:06:10.990364 4734 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-5jvs4\" (UniqueName: \"kubernetes.io/projected/89aa4ce4-ff20-4e06-b447-b1bce857e5d2-kube-api-access-5jvs4\") pod \"redhat-marketplace-fnjwq\" (UID: \"89aa4ce4-ff20-4e06-b447-b1bce857e5d2\") " pod="openshift-marketplace/redhat-marketplace-fnjwq" Dec 06 00:06:11 crc kubenswrapper[4734]: I1206 00:06:11.041296 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fnjwq" Dec 06 00:06:11 crc kubenswrapper[4734]: I1206 00:06:11.579939 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fnjwq"] Dec 06 00:06:12 crc kubenswrapper[4734]: I1206 00:06:12.482689 4734 generic.go:334] "Generic (PLEG): container finished" podID="89aa4ce4-ff20-4e06-b447-b1bce857e5d2" containerID="45c04bc165c974d93b9dcc784aee79fc3024fa87d901124ef4781cdaa50e4d3f" exitCode=0 Dec 06 00:06:12 crc kubenswrapper[4734]: I1206 00:06:12.482793 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fnjwq" event={"ID":"89aa4ce4-ff20-4e06-b447-b1bce857e5d2","Type":"ContainerDied","Data":"45c04bc165c974d93b9dcc784aee79fc3024fa87d901124ef4781cdaa50e4d3f"} Dec 06 00:06:12 crc kubenswrapper[4734]: I1206 00:06:12.483282 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fnjwq" event={"ID":"89aa4ce4-ff20-4e06-b447-b1bce857e5d2","Type":"ContainerStarted","Data":"4d45ba818f9a7b1d261f77cfa93d1cbac862d07ef6b2f9996436adf2c272c5a9"} Dec 06 00:06:14 crc kubenswrapper[4734]: I1206 00:06:14.508132 4734 generic.go:334] "Generic (PLEG): container finished" podID="89aa4ce4-ff20-4e06-b447-b1bce857e5d2" containerID="7bbb635fb25b477d2388eca9fc605d6bda69e9a25173d8fead36d59e12866f7b" exitCode=0 Dec 06 00:06:14 crc kubenswrapper[4734]: I1206 00:06:14.508271 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fnjwq" 
event={"ID":"89aa4ce4-ff20-4e06-b447-b1bce857e5d2","Type":"ContainerDied","Data":"7bbb635fb25b477d2388eca9fc605d6bda69e9a25173d8fead36d59e12866f7b"} Dec 06 00:06:15 crc kubenswrapper[4734]: I1206 00:06:15.521385 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fnjwq" event={"ID":"89aa4ce4-ff20-4e06-b447-b1bce857e5d2","Type":"ContainerStarted","Data":"f7c751fc6d2c33df34e24205a74c16b67db687d17cff6315f7786f1d583c7829"} Dec 06 00:06:15 crc kubenswrapper[4734]: I1206 00:06:15.554201 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fnjwq" podStartSLOduration=3.062304194 podStartE2EDuration="5.554172007s" podCreationTimestamp="2025-12-06 00:06:10 +0000 UTC" firstStartedPulling="2025-12-06 00:06:12.484978541 +0000 UTC m=+2793.168382817" lastFinishedPulling="2025-12-06 00:06:14.976846354 +0000 UTC m=+2795.660250630" observedRunningTime="2025-12-06 00:06:15.547815041 +0000 UTC m=+2796.231219317" watchObservedRunningTime="2025-12-06 00:06:15.554172007 +0000 UTC m=+2796.237576283" Dec 06 00:06:20 crc kubenswrapper[4734]: I1206 00:06:20.444513 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 00:06:20 crc kubenswrapper[4734]: I1206 00:06:20.445490 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 00:06:20 crc kubenswrapper[4734]: I1206 00:06:20.445561 4734 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-vn94d" Dec 06 00:06:20 crc kubenswrapper[4734]: I1206 00:06:20.446581 4734 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4163ec915cc478d242c1011fc445081e62cb49bf78f698e993005b968d9348bf"} pod="openshift-machine-config-operator/machine-config-daemon-vn94d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 00:06:20 crc kubenswrapper[4734]: I1206 00:06:20.446650 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" containerID="cri-o://4163ec915cc478d242c1011fc445081e62cb49bf78f698e993005b968d9348bf" gracePeriod=600 Dec 06 00:06:21 crc kubenswrapper[4734]: I1206 00:06:21.042295 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fnjwq" Dec 06 00:06:21 crc kubenswrapper[4734]: I1206 00:06:21.042834 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fnjwq" Dec 06 00:06:21 crc kubenswrapper[4734]: I1206 00:06:21.097921 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fnjwq" Dec 06 00:06:21 crc kubenswrapper[4734]: I1206 00:06:21.603329 4734 generic.go:334] "Generic (PLEG): container finished" podID="65758270-a7a7-46b5-af95-0588daf9fa86" containerID="4163ec915cc478d242c1011fc445081e62cb49bf78f698e993005b968d9348bf" exitCode=0 Dec 06 00:06:21 crc kubenswrapper[4734]: I1206 00:06:21.603760 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" 
event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerDied","Data":"4163ec915cc478d242c1011fc445081e62cb49bf78f698e993005b968d9348bf"} Dec 06 00:06:21 crc kubenswrapper[4734]: I1206 00:06:21.603810 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerStarted","Data":"22dd21d7db0e95a5c72f8c095ca9b5794ebebca64374ec13a730e95ad481a85b"} Dec 06 00:06:21 crc kubenswrapper[4734]: I1206 00:06:21.603838 4734 scope.go:117] "RemoveContainer" containerID="c1125a47316243dbc8e4b9f56d99d1db26d491a48005ab2e218e005031c75762" Dec 06 00:06:21 crc kubenswrapper[4734]: I1206 00:06:21.685049 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fnjwq" Dec 06 00:06:21 crc kubenswrapper[4734]: I1206 00:06:21.747295 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fnjwq"] Dec 06 00:06:23 crc kubenswrapper[4734]: I1206 00:06:23.636964 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fnjwq" podUID="89aa4ce4-ff20-4e06-b447-b1bce857e5d2" containerName="registry-server" containerID="cri-o://f7c751fc6d2c33df34e24205a74c16b67db687d17cff6315f7786f1d583c7829" gracePeriod=2 Dec 06 00:06:24 crc kubenswrapper[4734]: I1206 00:06:24.254483 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fnjwq" Dec 06 00:06:24 crc kubenswrapper[4734]: I1206 00:06:24.367506 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89aa4ce4-ff20-4e06-b447-b1bce857e5d2-catalog-content\") pod \"89aa4ce4-ff20-4e06-b447-b1bce857e5d2\" (UID: \"89aa4ce4-ff20-4e06-b447-b1bce857e5d2\") " Dec 06 00:06:24 crc kubenswrapper[4734]: I1206 00:06:24.368112 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jvs4\" (UniqueName: \"kubernetes.io/projected/89aa4ce4-ff20-4e06-b447-b1bce857e5d2-kube-api-access-5jvs4\") pod \"89aa4ce4-ff20-4e06-b447-b1bce857e5d2\" (UID: \"89aa4ce4-ff20-4e06-b447-b1bce857e5d2\") " Dec 06 00:06:24 crc kubenswrapper[4734]: I1206 00:06:24.368349 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89aa4ce4-ff20-4e06-b447-b1bce857e5d2-utilities\") pod \"89aa4ce4-ff20-4e06-b447-b1bce857e5d2\" (UID: \"89aa4ce4-ff20-4e06-b447-b1bce857e5d2\") " Dec 06 00:06:24 crc kubenswrapper[4734]: I1206 00:06:24.369373 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89aa4ce4-ff20-4e06-b447-b1bce857e5d2-utilities" (OuterVolumeSpecName: "utilities") pod "89aa4ce4-ff20-4e06-b447-b1bce857e5d2" (UID: "89aa4ce4-ff20-4e06-b447-b1bce857e5d2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 00:06:24 crc kubenswrapper[4734]: I1206 00:06:24.377791 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89aa4ce4-ff20-4e06-b447-b1bce857e5d2-kube-api-access-5jvs4" (OuterVolumeSpecName: "kube-api-access-5jvs4") pod "89aa4ce4-ff20-4e06-b447-b1bce857e5d2" (UID: "89aa4ce4-ff20-4e06-b447-b1bce857e5d2"). InnerVolumeSpecName "kube-api-access-5jvs4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:06:24 crc kubenswrapper[4734]: I1206 00:06:24.397963 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89aa4ce4-ff20-4e06-b447-b1bce857e5d2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "89aa4ce4-ff20-4e06-b447-b1bce857e5d2" (UID: "89aa4ce4-ff20-4e06-b447-b1bce857e5d2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 00:06:24 crc kubenswrapper[4734]: I1206 00:06:24.471563 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jvs4\" (UniqueName: \"kubernetes.io/projected/89aa4ce4-ff20-4e06-b447-b1bce857e5d2-kube-api-access-5jvs4\") on node \"crc\" DevicePath \"\"" Dec 06 00:06:24 crc kubenswrapper[4734]: I1206 00:06:24.471625 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89aa4ce4-ff20-4e06-b447-b1bce857e5d2-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 00:06:24 crc kubenswrapper[4734]: I1206 00:06:24.471643 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89aa4ce4-ff20-4e06-b447-b1bce857e5d2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 00:06:24 crc kubenswrapper[4734]: I1206 00:06:24.651963 4734 generic.go:334] "Generic (PLEG): container finished" podID="89aa4ce4-ff20-4e06-b447-b1bce857e5d2" containerID="f7c751fc6d2c33df34e24205a74c16b67db687d17cff6315f7786f1d583c7829" exitCode=0 Dec 06 00:06:24 crc kubenswrapper[4734]: I1206 00:06:24.652039 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fnjwq" Dec 06 00:06:24 crc kubenswrapper[4734]: I1206 00:06:24.652030 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fnjwq" event={"ID":"89aa4ce4-ff20-4e06-b447-b1bce857e5d2","Type":"ContainerDied","Data":"f7c751fc6d2c33df34e24205a74c16b67db687d17cff6315f7786f1d583c7829"} Dec 06 00:06:24 crc kubenswrapper[4734]: I1206 00:06:24.652686 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fnjwq" event={"ID":"89aa4ce4-ff20-4e06-b447-b1bce857e5d2","Type":"ContainerDied","Data":"4d45ba818f9a7b1d261f77cfa93d1cbac862d07ef6b2f9996436adf2c272c5a9"} Dec 06 00:06:24 crc kubenswrapper[4734]: I1206 00:06:24.652724 4734 scope.go:117] "RemoveContainer" containerID="f7c751fc6d2c33df34e24205a74c16b67db687d17cff6315f7786f1d583c7829" Dec 06 00:06:24 crc kubenswrapper[4734]: I1206 00:06:24.683310 4734 scope.go:117] "RemoveContainer" containerID="7bbb635fb25b477d2388eca9fc605d6bda69e9a25173d8fead36d59e12866f7b" Dec 06 00:06:24 crc kubenswrapper[4734]: I1206 00:06:24.714992 4734 scope.go:117] "RemoveContainer" containerID="45c04bc165c974d93b9dcc784aee79fc3024fa87d901124ef4781cdaa50e4d3f" Dec 06 00:06:24 crc kubenswrapper[4734]: I1206 00:06:24.719621 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fnjwq"] Dec 06 00:06:24 crc kubenswrapper[4734]: I1206 00:06:24.767138 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fnjwq"] Dec 06 00:06:24 crc kubenswrapper[4734]: I1206 00:06:24.782168 4734 scope.go:117] "RemoveContainer" containerID="f7c751fc6d2c33df34e24205a74c16b67db687d17cff6315f7786f1d583c7829" Dec 06 00:06:24 crc kubenswrapper[4734]: E1206 00:06:24.782810 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f7c751fc6d2c33df34e24205a74c16b67db687d17cff6315f7786f1d583c7829\": container with ID starting with f7c751fc6d2c33df34e24205a74c16b67db687d17cff6315f7786f1d583c7829 not found: ID does not exist" containerID="f7c751fc6d2c33df34e24205a74c16b67db687d17cff6315f7786f1d583c7829" Dec 06 00:06:24 crc kubenswrapper[4734]: I1206 00:06:24.782890 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7c751fc6d2c33df34e24205a74c16b67db687d17cff6315f7786f1d583c7829"} err="failed to get container status \"f7c751fc6d2c33df34e24205a74c16b67db687d17cff6315f7786f1d583c7829\": rpc error: code = NotFound desc = could not find container \"f7c751fc6d2c33df34e24205a74c16b67db687d17cff6315f7786f1d583c7829\": container with ID starting with f7c751fc6d2c33df34e24205a74c16b67db687d17cff6315f7786f1d583c7829 not found: ID does not exist" Dec 06 00:06:24 crc kubenswrapper[4734]: I1206 00:06:24.782926 4734 scope.go:117] "RemoveContainer" containerID="7bbb635fb25b477d2388eca9fc605d6bda69e9a25173d8fead36d59e12866f7b" Dec 06 00:06:24 crc kubenswrapper[4734]: E1206 00:06:24.783424 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bbb635fb25b477d2388eca9fc605d6bda69e9a25173d8fead36d59e12866f7b\": container with ID starting with 7bbb635fb25b477d2388eca9fc605d6bda69e9a25173d8fead36d59e12866f7b not found: ID does not exist" containerID="7bbb635fb25b477d2388eca9fc605d6bda69e9a25173d8fead36d59e12866f7b" Dec 06 00:06:24 crc kubenswrapper[4734]: I1206 00:06:24.783451 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bbb635fb25b477d2388eca9fc605d6bda69e9a25173d8fead36d59e12866f7b"} err="failed to get container status \"7bbb635fb25b477d2388eca9fc605d6bda69e9a25173d8fead36d59e12866f7b\": rpc error: code = NotFound desc = could not find container \"7bbb635fb25b477d2388eca9fc605d6bda69e9a25173d8fead36d59e12866f7b\": container with ID 
starting with 7bbb635fb25b477d2388eca9fc605d6bda69e9a25173d8fead36d59e12866f7b not found: ID does not exist" Dec 06 00:06:24 crc kubenswrapper[4734]: I1206 00:06:24.783466 4734 scope.go:117] "RemoveContainer" containerID="45c04bc165c974d93b9dcc784aee79fc3024fa87d901124ef4781cdaa50e4d3f" Dec 06 00:06:24 crc kubenswrapper[4734]: E1206 00:06:24.783913 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45c04bc165c974d93b9dcc784aee79fc3024fa87d901124ef4781cdaa50e4d3f\": container with ID starting with 45c04bc165c974d93b9dcc784aee79fc3024fa87d901124ef4781cdaa50e4d3f not found: ID does not exist" containerID="45c04bc165c974d93b9dcc784aee79fc3024fa87d901124ef4781cdaa50e4d3f" Dec 06 00:06:24 crc kubenswrapper[4734]: I1206 00:06:24.784016 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45c04bc165c974d93b9dcc784aee79fc3024fa87d901124ef4781cdaa50e4d3f"} err="failed to get container status \"45c04bc165c974d93b9dcc784aee79fc3024fa87d901124ef4781cdaa50e4d3f\": rpc error: code = NotFound desc = could not find container \"45c04bc165c974d93b9dcc784aee79fc3024fa87d901124ef4781cdaa50e4d3f\": container with ID starting with 45c04bc165c974d93b9dcc784aee79fc3024fa87d901124ef4781cdaa50e4d3f not found: ID does not exist" Dec 06 00:06:24 crc kubenswrapper[4734]: E1206 00:06:24.822738 4734 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89aa4ce4_ff20_4e06_b447_b1bce857e5d2.slice/crio-4d45ba818f9a7b1d261f77cfa93d1cbac862d07ef6b2f9996436adf2c272c5a9\": RecentStats: unable to find data in memory cache]" Dec 06 00:06:25 crc kubenswrapper[4734]: I1206 00:06:25.627075 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89aa4ce4-ff20-4e06-b447-b1bce857e5d2" 
path="/var/lib/kubelet/pods/89aa4ce4-ff20-4e06-b447-b1bce857e5d2/volumes" Dec 06 00:07:06 crc kubenswrapper[4734]: I1206 00:07:06.084235 4734 generic.go:334] "Generic (PLEG): container finished" podID="039811b0-a938-445d-b5a4-702b526f8356" containerID="beca813844c257715680290ba8ebf794fa607f38b784740e93669e632d8255fb" exitCode=0 Dec 06 00:07:06 crc kubenswrapper[4734]: I1206 00:07:06.084348 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb" event={"ID":"039811b0-a938-445d-b5a4-702b526f8356","Type":"ContainerDied","Data":"beca813844c257715680290ba8ebf794fa607f38b784740e93669e632d8255fb"} Dec 06 00:07:07 crc kubenswrapper[4734]: I1206 00:07:07.543518 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb" Dec 06 00:07:07 crc kubenswrapper[4734]: I1206 00:07:07.699437 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-telemetry-combined-ca-bundle\") pod \"039811b0-a938-445d-b5a4-702b526f8356\" (UID: \"039811b0-a938-445d-b5a4-702b526f8356\") " Dec 06 00:07:07 crc kubenswrapper[4734]: I1206 00:07:07.699505 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-inventory\") pod \"039811b0-a938-445d-b5a4-702b526f8356\" (UID: \"039811b0-a938-445d-b5a4-702b526f8356\") " Dec 06 00:07:07 crc kubenswrapper[4734]: I1206 00:07:07.699633 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-ceilometer-compute-config-data-1\") pod \"039811b0-a938-445d-b5a4-702b526f8356\" (UID: \"039811b0-a938-445d-b5a4-702b526f8356\") " Dec 06 
00:07:07 crc kubenswrapper[4734]: I1206 00:07:07.699676 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-ceilometer-compute-config-data-0\") pod \"039811b0-a938-445d-b5a4-702b526f8356\" (UID: \"039811b0-a938-445d-b5a4-702b526f8356\") " Dec 06 00:07:07 crc kubenswrapper[4734]: I1206 00:07:07.699771 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wt5x\" (UniqueName: \"kubernetes.io/projected/039811b0-a938-445d-b5a4-702b526f8356-kube-api-access-9wt5x\") pod \"039811b0-a938-445d-b5a4-702b526f8356\" (UID: \"039811b0-a938-445d-b5a4-702b526f8356\") " Dec 06 00:07:07 crc kubenswrapper[4734]: I1206 00:07:07.699877 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-ssh-key\") pod \"039811b0-a938-445d-b5a4-702b526f8356\" (UID: \"039811b0-a938-445d-b5a4-702b526f8356\") " Dec 06 00:07:07 crc kubenswrapper[4734]: I1206 00:07:07.699912 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-ceilometer-compute-config-data-2\") pod \"039811b0-a938-445d-b5a4-702b526f8356\" (UID: \"039811b0-a938-445d-b5a4-702b526f8356\") " Dec 06 00:07:07 crc kubenswrapper[4734]: I1206 00:07:07.708643 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/039811b0-a938-445d-b5a4-702b526f8356-kube-api-access-9wt5x" (OuterVolumeSpecName: "kube-api-access-9wt5x") pod "039811b0-a938-445d-b5a4-702b526f8356" (UID: "039811b0-a938-445d-b5a4-702b526f8356"). InnerVolumeSpecName "kube-api-access-9wt5x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:07:07 crc kubenswrapper[4734]: I1206 00:07:07.709033 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "039811b0-a938-445d-b5a4-702b526f8356" (UID: "039811b0-a938-445d-b5a4-702b526f8356"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:07:07 crc kubenswrapper[4734]: I1206 00:07:07.732743 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "039811b0-a938-445d-b5a4-702b526f8356" (UID: "039811b0-a938-445d-b5a4-702b526f8356"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:07:07 crc kubenswrapper[4734]: I1206 00:07:07.734128 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "039811b0-a938-445d-b5a4-702b526f8356" (UID: "039811b0-a938-445d-b5a4-702b526f8356"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:07:07 crc kubenswrapper[4734]: I1206 00:07:07.735690 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "039811b0-a938-445d-b5a4-702b526f8356" (UID: "039811b0-a938-445d-b5a4-702b526f8356"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:07:07 crc kubenswrapper[4734]: I1206 00:07:07.736251 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "039811b0-a938-445d-b5a4-702b526f8356" (UID: "039811b0-a938-445d-b5a4-702b526f8356"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:07:07 crc kubenswrapper[4734]: I1206 00:07:07.737763 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-inventory" (OuterVolumeSpecName: "inventory") pod "039811b0-a938-445d-b5a4-702b526f8356" (UID: "039811b0-a938-445d-b5a4-702b526f8356"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:07:07 crc kubenswrapper[4734]: I1206 00:07:07.802662 4734 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 06 00:07:07 crc kubenswrapper[4734]: I1206 00:07:07.802695 4734 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-inventory\") on node \"crc\" DevicePath \"\"" Dec 06 00:07:07 crc kubenswrapper[4734]: I1206 00:07:07.802707 4734 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 06 00:07:07 crc kubenswrapper[4734]: I1206 00:07:07.802721 4734 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 06 00:07:07 crc kubenswrapper[4734]: I1206 00:07:07.802737 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wt5x\" (UniqueName: \"kubernetes.io/projected/039811b0-a938-445d-b5a4-702b526f8356-kube-api-access-9wt5x\") on node \"crc\" DevicePath \"\"" Dec 06 00:07:07 crc kubenswrapper[4734]: I1206 00:07:07.802751 4734 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 00:07:07 crc kubenswrapper[4734]: I1206 00:07:07.802773 4734 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/039811b0-a938-445d-b5a4-702b526f8356-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 06 00:07:08 crc kubenswrapper[4734]: I1206 00:07:08.108488 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb" event={"ID":"039811b0-a938-445d-b5a4-702b526f8356","Type":"ContainerDied","Data":"d48a5e73af1a73a8ade5764006281871484b6b75ad4752bf32a501c7e01a5b41"} Dec 06 00:07:08 crc kubenswrapper[4734]: I1206 00:07:08.108938 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d48a5e73af1a73a8ade5764006281871484b6b75ad4752bf32a501c7e01a5b41" Dec 06 00:07:08 crc kubenswrapper[4734]: I1206 00:07:08.108875 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.131979 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 06 00:08:03 crc kubenswrapper[4734]: E1206 00:08:03.133504 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89aa4ce4-ff20-4e06-b447-b1bce857e5d2" containerName="extract-utilities" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.133541 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="89aa4ce4-ff20-4e06-b447-b1bce857e5d2" containerName="extract-utilities" Dec 06 00:08:03 crc kubenswrapper[4734]: E1206 00:08:03.133560 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89aa4ce4-ff20-4e06-b447-b1bce857e5d2" containerName="extract-content" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.133567 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="89aa4ce4-ff20-4e06-b447-b1bce857e5d2" containerName="extract-content" Dec 06 00:08:03 crc kubenswrapper[4734]: E1206 00:08:03.133582 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="039811b0-a938-445d-b5a4-702b526f8356" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.133595 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="039811b0-a938-445d-b5a4-702b526f8356" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 06 00:08:03 crc kubenswrapper[4734]: E1206 00:08:03.133614 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89aa4ce4-ff20-4e06-b447-b1bce857e5d2" containerName="registry-server" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.133620 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="89aa4ce4-ff20-4e06-b447-b1bce857e5d2" containerName="registry-server" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.133824 4734 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="039811b0-a938-445d-b5a4-702b526f8356" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.133845 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="89aa4ce4-ff20-4e06-b447-b1bce857e5d2" containerName="registry-server" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.134839 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.140086 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.140384 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.140255 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.140354 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-ccp8x" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.140978 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.232289 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d24dfd1-9ec6-4419-84c6-577deb60b95f-config-data\") pod \"tempest-tests-tempest\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " pod="openstack/tempest-tests-tempest" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.232510 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/5d24dfd1-9ec6-4419-84c6-577deb60b95f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " pod="openstack/tempest-tests-tempest" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.233201 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5d24dfd1-9ec6-4419-84c6-577deb60b95f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " pod="openstack/tempest-tests-tempest" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.335922 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5d24dfd1-9ec6-4419-84c6-577deb60b95f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " pod="openstack/tempest-tests-tempest" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.336011 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d24dfd1-9ec6-4419-84c6-577deb60b95f-config-data\") pod \"tempest-tests-tempest\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " pod="openstack/tempest-tests-tempest" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.336078 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5d24dfd1-9ec6-4419-84c6-577deb60b95f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " pod="openstack/tempest-tests-tempest" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.336120 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/5d24dfd1-9ec6-4419-84c6-577deb60b95f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " pod="openstack/tempest-tests-tempest" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.336142 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5d24dfd1-9ec6-4419-84c6-577deb60b95f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " pod="openstack/tempest-tests-tempest" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.336182 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bw9z\" (UniqueName: \"kubernetes.io/projected/5d24dfd1-9ec6-4419-84c6-577deb60b95f-kube-api-access-2bw9z\") pod \"tempest-tests-tempest\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " pod="openstack/tempest-tests-tempest" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.336235 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5d24dfd1-9ec6-4419-84c6-577deb60b95f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " pod="openstack/tempest-tests-tempest" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.336261 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5d24dfd1-9ec6-4419-84c6-577deb60b95f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " pod="openstack/tempest-tests-tempest" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.336813 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " pod="openstack/tempest-tests-tempest" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.338161 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d24dfd1-9ec6-4419-84c6-577deb60b95f-config-data\") pod \"tempest-tests-tempest\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " pod="openstack/tempest-tests-tempest" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.338828 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5d24dfd1-9ec6-4419-84c6-577deb60b95f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " pod="openstack/tempest-tests-tempest" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.344935 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5d24dfd1-9ec6-4419-84c6-577deb60b95f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " pod="openstack/tempest-tests-tempest" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.438876 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5d24dfd1-9ec6-4419-84c6-577deb60b95f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " pod="openstack/tempest-tests-tempest" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.438964 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d24dfd1-9ec6-4419-84c6-577deb60b95f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " 
pod="openstack/tempest-tests-tempest" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.439016 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bw9z\" (UniqueName: \"kubernetes.io/projected/5d24dfd1-9ec6-4419-84c6-577deb60b95f-kube-api-access-2bw9z\") pod \"tempest-tests-tempest\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " pod="openstack/tempest-tests-tempest" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.439044 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5d24dfd1-9ec6-4419-84c6-577deb60b95f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " pod="openstack/tempest-tests-tempest" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.439100 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " pod="openstack/tempest-tests-tempest" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.439125 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5d24dfd1-9ec6-4419-84c6-577deb60b95f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " pod="openstack/tempest-tests-tempest" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.439735 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5d24dfd1-9ec6-4419-84c6-577deb60b95f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " 
pod="openstack/tempest-tests-tempest" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.439849 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5d24dfd1-9ec6-4419-84c6-577deb60b95f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " pod="openstack/tempest-tests-tempest" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.440103 4734 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/tempest-tests-tempest" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.448058 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d24dfd1-9ec6-4419-84c6-577deb60b95f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " pod="openstack/tempest-tests-tempest" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.454298 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5d24dfd1-9ec6-4419-84c6-577deb60b95f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " pod="openstack/tempest-tests-tempest" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.461082 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bw9z\" (UniqueName: \"kubernetes.io/projected/5d24dfd1-9ec6-4419-84c6-577deb60b95f-kube-api-access-2bw9z\") pod \"tempest-tests-tempest\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " pod="openstack/tempest-tests-tempest" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.477660 4734 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " pod="openstack/tempest-tests-tempest" Dec 06 00:08:03 crc kubenswrapper[4734]: I1206 00:08:03.773821 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 06 00:08:04 crc kubenswrapper[4734]: I1206 00:08:04.256660 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 06 00:08:04 crc kubenswrapper[4734]: I1206 00:08:04.711334 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5d24dfd1-9ec6-4419-84c6-577deb60b95f","Type":"ContainerStarted","Data":"f61033f57cc009ff380a8bcc069bd70943668c113ca81b22873ac528d506e8a3"} Dec 06 00:08:20 crc kubenswrapper[4734]: I1206 00:08:20.445844 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 00:08:20 crc kubenswrapper[4734]: I1206 00:08:20.446648 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 00:08:43 crc kubenswrapper[4734]: E1206 00:08:43.349982 4734 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 06 00:08:43 crc kubenswrapper[4734]: E1206 00:08:43.350928 
4734 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2bw9z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPat
hExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(5d24dfd1-9ec6-4419-84c6-577deb60b95f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 06 00:08:43 crc kubenswrapper[4734]: E1206 00:08:43.352170 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="5d24dfd1-9ec6-4419-84c6-577deb60b95f" Dec 06 00:08:44 crc kubenswrapper[4734]: E1206 00:08:44.164924 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" 
podUID="5d24dfd1-9ec6-4419-84c6-577deb60b95f" Dec 06 00:08:50 crc kubenswrapper[4734]: I1206 00:08:50.445133 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 00:08:50 crc kubenswrapper[4734]: I1206 00:08:50.446058 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 00:08:57 crc kubenswrapper[4734]: I1206 00:08:57.226307 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 06 00:08:58 crc kubenswrapper[4734]: I1206 00:08:58.331559 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5d24dfd1-9ec6-4419-84c6-577deb60b95f","Type":"ContainerStarted","Data":"b44b03c27e1d40353aa910c6a11ab1d2aa1041ef998f377eb6d38c6d268f18cb"} Dec 06 00:08:58 crc kubenswrapper[4734]: I1206 00:08:58.376087 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.41934101 podStartE2EDuration="56.376058045s" podCreationTimestamp="2025-12-06 00:08:02 +0000 UTC" firstStartedPulling="2025-12-06 00:08:04.266212501 +0000 UTC m=+2904.949616777" lastFinishedPulling="2025-12-06 00:08:57.222929536 +0000 UTC m=+2957.906333812" observedRunningTime="2025-12-06 00:08:58.361231133 +0000 UTC m=+2959.044635409" watchObservedRunningTime="2025-12-06 00:08:58.376058045 +0000 UTC m=+2959.059462321" Dec 06 00:09:20 crc kubenswrapper[4734]: I1206 00:09:20.445980 4734 patch_prober.go:28] 
interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 00:09:20 crc kubenswrapper[4734]: I1206 00:09:20.446852 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 00:09:20 crc kubenswrapper[4734]: I1206 00:09:20.446905 4734 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" Dec 06 00:09:20 crc kubenswrapper[4734]: I1206 00:09:20.447452 4734 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"22dd21d7db0e95a5c72f8c095ca9b5794ebebca64374ec13a730e95ad481a85b"} pod="openshift-machine-config-operator/machine-config-daemon-vn94d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 00:09:20 crc kubenswrapper[4734]: I1206 00:09:20.447503 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" containerID="cri-o://22dd21d7db0e95a5c72f8c095ca9b5794ebebca64374ec13a730e95ad481a85b" gracePeriod=600 Dec 06 00:09:20 crc kubenswrapper[4734]: E1206 00:09:20.579649 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:09:21 crc kubenswrapper[4734]: I1206 00:09:21.576571 4734 generic.go:334] "Generic (PLEG): container finished" podID="65758270-a7a7-46b5-af95-0588daf9fa86" containerID="22dd21d7db0e95a5c72f8c095ca9b5794ebebca64374ec13a730e95ad481a85b" exitCode=0 Dec 06 00:09:21 crc kubenswrapper[4734]: I1206 00:09:21.576738 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerDied","Data":"22dd21d7db0e95a5c72f8c095ca9b5794ebebca64374ec13a730e95ad481a85b"} Dec 06 00:09:21 crc kubenswrapper[4734]: I1206 00:09:21.576929 4734 scope.go:117] "RemoveContainer" containerID="4163ec915cc478d242c1011fc445081e62cb49bf78f698e993005b968d9348bf" Dec 06 00:09:21 crc kubenswrapper[4734]: I1206 00:09:21.577729 4734 scope.go:117] "RemoveContainer" containerID="22dd21d7db0e95a5c72f8c095ca9b5794ebebca64374ec13a730e95ad481a85b" Dec 06 00:09:21 crc kubenswrapper[4734]: E1206 00:09:21.577960 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:09:33 crc kubenswrapper[4734]: I1206 00:09:33.614164 4734 scope.go:117] "RemoveContainer" containerID="22dd21d7db0e95a5c72f8c095ca9b5794ebebca64374ec13a730e95ad481a85b" Dec 06 00:09:33 crc kubenswrapper[4734]: E1206 00:09:33.615461 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:09:47 crc kubenswrapper[4734]: I1206 00:09:47.615056 4734 scope.go:117] "RemoveContainer" containerID="22dd21d7db0e95a5c72f8c095ca9b5794ebebca64374ec13a730e95ad481a85b" Dec 06 00:09:47 crc kubenswrapper[4734]: E1206 00:09:47.616261 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:09:59 crc kubenswrapper[4734]: I1206 00:09:59.629221 4734 scope.go:117] "RemoveContainer" containerID="22dd21d7db0e95a5c72f8c095ca9b5794ebebca64374ec13a730e95ad481a85b" Dec 06 00:09:59 crc kubenswrapper[4734]: E1206 00:09:59.630612 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:10:12 crc kubenswrapper[4734]: I1206 00:10:12.360322 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lct7w"] Dec 06 00:10:12 crc kubenswrapper[4734]: I1206 00:10:12.364219 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lct7w" Dec 06 00:10:12 crc kubenswrapper[4734]: I1206 00:10:12.398543 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lct7w"] Dec 06 00:10:12 crc kubenswrapper[4734]: I1206 00:10:12.537457 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7tgp\" (UniqueName: \"kubernetes.io/projected/9afab016-6a20-4e96-b606-54dc1b9da47f-kube-api-access-p7tgp\") pod \"community-operators-lct7w\" (UID: \"9afab016-6a20-4e96-b606-54dc1b9da47f\") " pod="openshift-marketplace/community-operators-lct7w" Dec 06 00:10:12 crc kubenswrapper[4734]: I1206 00:10:12.537630 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9afab016-6a20-4e96-b606-54dc1b9da47f-utilities\") pod \"community-operators-lct7w\" (UID: \"9afab016-6a20-4e96-b606-54dc1b9da47f\") " pod="openshift-marketplace/community-operators-lct7w" Dec 06 00:10:12 crc kubenswrapper[4734]: I1206 00:10:12.537709 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9afab016-6a20-4e96-b606-54dc1b9da47f-catalog-content\") pod \"community-operators-lct7w\" (UID: \"9afab016-6a20-4e96-b606-54dc1b9da47f\") " pod="openshift-marketplace/community-operators-lct7w" Dec 06 00:10:12 crc kubenswrapper[4734]: I1206 00:10:12.640857 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7tgp\" (UniqueName: \"kubernetes.io/projected/9afab016-6a20-4e96-b606-54dc1b9da47f-kube-api-access-p7tgp\") pod \"community-operators-lct7w\" (UID: \"9afab016-6a20-4e96-b606-54dc1b9da47f\") " pod="openshift-marketplace/community-operators-lct7w" Dec 06 00:10:12 crc kubenswrapper[4734]: I1206 00:10:12.640945 4734 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9afab016-6a20-4e96-b606-54dc1b9da47f-utilities\") pod \"community-operators-lct7w\" (UID: \"9afab016-6a20-4e96-b606-54dc1b9da47f\") " pod="openshift-marketplace/community-operators-lct7w" Dec 06 00:10:12 crc kubenswrapper[4734]: I1206 00:10:12.640995 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9afab016-6a20-4e96-b606-54dc1b9da47f-catalog-content\") pod \"community-operators-lct7w\" (UID: \"9afab016-6a20-4e96-b606-54dc1b9da47f\") " pod="openshift-marketplace/community-operators-lct7w" Dec 06 00:10:12 crc kubenswrapper[4734]: I1206 00:10:12.641654 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9afab016-6a20-4e96-b606-54dc1b9da47f-catalog-content\") pod \"community-operators-lct7w\" (UID: \"9afab016-6a20-4e96-b606-54dc1b9da47f\") " pod="openshift-marketplace/community-operators-lct7w" Dec 06 00:10:12 crc kubenswrapper[4734]: I1206 00:10:12.641660 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9afab016-6a20-4e96-b606-54dc1b9da47f-utilities\") pod \"community-operators-lct7w\" (UID: \"9afab016-6a20-4e96-b606-54dc1b9da47f\") " pod="openshift-marketplace/community-operators-lct7w" Dec 06 00:10:12 crc kubenswrapper[4734]: I1206 00:10:12.664660 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7tgp\" (UniqueName: \"kubernetes.io/projected/9afab016-6a20-4e96-b606-54dc1b9da47f-kube-api-access-p7tgp\") pod \"community-operators-lct7w\" (UID: \"9afab016-6a20-4e96-b606-54dc1b9da47f\") " pod="openshift-marketplace/community-operators-lct7w" Dec 06 00:10:12 crc kubenswrapper[4734]: I1206 00:10:12.688978 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lct7w" Dec 06 00:10:13 crc kubenswrapper[4734]: I1206 00:10:13.267786 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lct7w"] Dec 06 00:10:13 crc kubenswrapper[4734]: I1206 00:10:13.615399 4734 scope.go:117] "RemoveContainer" containerID="22dd21d7db0e95a5c72f8c095ca9b5794ebebca64374ec13a730e95ad481a85b" Dec 06 00:10:13 crc kubenswrapper[4734]: E1206 00:10:13.615768 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:10:14 crc kubenswrapper[4734]: I1206 00:10:14.139972 4734 generic.go:334] "Generic (PLEG): container finished" podID="9afab016-6a20-4e96-b606-54dc1b9da47f" containerID="874f9cc415eaa1113611521beeef0262f60be4fa9f07d3b535958ca7dbf9da35" exitCode=0 Dec 06 00:10:14 crc kubenswrapper[4734]: I1206 00:10:14.140058 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lct7w" event={"ID":"9afab016-6a20-4e96-b606-54dc1b9da47f","Type":"ContainerDied","Data":"874f9cc415eaa1113611521beeef0262f60be4fa9f07d3b535958ca7dbf9da35"} Dec 06 00:10:14 crc kubenswrapper[4734]: I1206 00:10:14.140310 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lct7w" event={"ID":"9afab016-6a20-4e96-b606-54dc1b9da47f","Type":"ContainerStarted","Data":"90d30956d2a6099e688057adf5e28488291429f32fafd2237f82752e9fed8ef3"} Dec 06 00:10:14 crc kubenswrapper[4734]: I1206 00:10:14.143477 4734 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 
00:10:16 crc kubenswrapper[4734]: I1206 00:10:16.166197 4734 generic.go:334] "Generic (PLEG): container finished" podID="9afab016-6a20-4e96-b606-54dc1b9da47f" containerID="6a5a9de08040cbc34985bde218cc83ac7572a76b0029557619960227ff4e76af" exitCode=0 Dec 06 00:10:16 crc kubenswrapper[4734]: I1206 00:10:16.166319 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lct7w" event={"ID":"9afab016-6a20-4e96-b606-54dc1b9da47f","Type":"ContainerDied","Data":"6a5a9de08040cbc34985bde218cc83ac7572a76b0029557619960227ff4e76af"} Dec 06 00:10:17 crc kubenswrapper[4734]: I1206 00:10:17.180698 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lct7w" event={"ID":"9afab016-6a20-4e96-b606-54dc1b9da47f","Type":"ContainerStarted","Data":"2840ca6a1a6aa433993cd826f2019e6e75594a071b3b767998b5d5b36bef1289"} Dec 06 00:10:17 crc kubenswrapper[4734]: I1206 00:10:17.209259 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lct7w" podStartSLOduration=2.753510448 podStartE2EDuration="5.209231744s" podCreationTimestamp="2025-12-06 00:10:12 +0000 UTC" firstStartedPulling="2025-12-06 00:10:14.143185297 +0000 UTC m=+3034.826589573" lastFinishedPulling="2025-12-06 00:10:16.598906603 +0000 UTC m=+3037.282310869" observedRunningTime="2025-12-06 00:10:17.206094877 +0000 UTC m=+3037.889499153" watchObservedRunningTime="2025-12-06 00:10:17.209231744 +0000 UTC m=+3037.892636020" Dec 06 00:10:22 crc kubenswrapper[4734]: I1206 00:10:22.689497 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lct7w" Dec 06 00:10:22 crc kubenswrapper[4734]: I1206 00:10:22.690907 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lct7w" Dec 06 00:10:22 crc kubenswrapper[4734]: I1206 00:10:22.745004 4734 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lct7w" Dec 06 00:10:23 crc kubenswrapper[4734]: I1206 00:10:23.289811 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lct7w" Dec 06 00:10:23 crc kubenswrapper[4734]: I1206 00:10:23.363755 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lct7w"] Dec 06 00:10:25 crc kubenswrapper[4734]: I1206 00:10:25.262674 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lct7w" podUID="9afab016-6a20-4e96-b606-54dc1b9da47f" containerName="registry-server" containerID="cri-o://2840ca6a1a6aa433993cd826f2019e6e75594a071b3b767998b5d5b36bef1289" gracePeriod=2 Dec 06 00:10:25 crc kubenswrapper[4734]: I1206 00:10:25.797952 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lct7w" Dec 06 00:10:25 crc kubenswrapper[4734]: I1206 00:10:25.853860 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9afab016-6a20-4e96-b606-54dc1b9da47f-utilities\") pod \"9afab016-6a20-4e96-b606-54dc1b9da47f\" (UID: \"9afab016-6a20-4e96-b606-54dc1b9da47f\") " Dec 06 00:10:25 crc kubenswrapper[4734]: I1206 00:10:25.854041 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9afab016-6a20-4e96-b606-54dc1b9da47f-catalog-content\") pod \"9afab016-6a20-4e96-b606-54dc1b9da47f\" (UID: \"9afab016-6a20-4e96-b606-54dc1b9da47f\") " Dec 06 00:10:25 crc kubenswrapper[4734]: I1206 00:10:25.854302 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7tgp\" (UniqueName: 
\"kubernetes.io/projected/9afab016-6a20-4e96-b606-54dc1b9da47f-kube-api-access-p7tgp\") pod \"9afab016-6a20-4e96-b606-54dc1b9da47f\" (UID: \"9afab016-6a20-4e96-b606-54dc1b9da47f\") " Dec 06 00:10:25 crc kubenswrapper[4734]: I1206 00:10:25.855181 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9afab016-6a20-4e96-b606-54dc1b9da47f-utilities" (OuterVolumeSpecName: "utilities") pod "9afab016-6a20-4e96-b606-54dc1b9da47f" (UID: "9afab016-6a20-4e96-b606-54dc1b9da47f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 00:10:25 crc kubenswrapper[4734]: I1206 00:10:25.862466 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9afab016-6a20-4e96-b606-54dc1b9da47f-kube-api-access-p7tgp" (OuterVolumeSpecName: "kube-api-access-p7tgp") pod "9afab016-6a20-4e96-b606-54dc1b9da47f" (UID: "9afab016-6a20-4e96-b606-54dc1b9da47f"). InnerVolumeSpecName "kube-api-access-p7tgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:10:25 crc kubenswrapper[4734]: I1206 00:10:25.957877 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7tgp\" (UniqueName: \"kubernetes.io/projected/9afab016-6a20-4e96-b606-54dc1b9da47f-kube-api-access-p7tgp\") on node \"crc\" DevicePath \"\"" Dec 06 00:10:25 crc kubenswrapper[4734]: I1206 00:10:25.958370 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9afab016-6a20-4e96-b606-54dc1b9da47f-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 00:10:26 crc kubenswrapper[4734]: I1206 00:10:26.024321 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9afab016-6a20-4e96-b606-54dc1b9da47f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9afab016-6a20-4e96-b606-54dc1b9da47f" (UID: "9afab016-6a20-4e96-b606-54dc1b9da47f"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 00:10:26 crc kubenswrapper[4734]: I1206 00:10:26.062250 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9afab016-6a20-4e96-b606-54dc1b9da47f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 00:10:26 crc kubenswrapper[4734]: I1206 00:10:26.276499 4734 generic.go:334] "Generic (PLEG): container finished" podID="9afab016-6a20-4e96-b606-54dc1b9da47f" containerID="2840ca6a1a6aa433993cd826f2019e6e75594a071b3b767998b5d5b36bef1289" exitCode=0 Dec 06 00:10:26 crc kubenswrapper[4734]: I1206 00:10:26.276572 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lct7w" event={"ID":"9afab016-6a20-4e96-b606-54dc1b9da47f","Type":"ContainerDied","Data":"2840ca6a1a6aa433993cd826f2019e6e75594a071b3b767998b5d5b36bef1289"} Dec 06 00:10:26 crc kubenswrapper[4734]: I1206 00:10:26.276635 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lct7w" Dec 06 00:10:26 crc kubenswrapper[4734]: I1206 00:10:26.276668 4734 scope.go:117] "RemoveContainer" containerID="2840ca6a1a6aa433993cd826f2019e6e75594a071b3b767998b5d5b36bef1289" Dec 06 00:10:26 crc kubenswrapper[4734]: I1206 00:10:26.276641 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lct7w" event={"ID":"9afab016-6a20-4e96-b606-54dc1b9da47f","Type":"ContainerDied","Data":"90d30956d2a6099e688057adf5e28488291429f32fafd2237f82752e9fed8ef3"} Dec 06 00:10:26 crc kubenswrapper[4734]: I1206 00:10:26.313889 4734 scope.go:117] "RemoveContainer" containerID="6a5a9de08040cbc34985bde218cc83ac7572a76b0029557619960227ff4e76af" Dec 06 00:10:26 crc kubenswrapper[4734]: I1206 00:10:26.321167 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lct7w"] Dec 06 00:10:26 crc kubenswrapper[4734]: I1206 00:10:26.332286 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lct7w"] Dec 06 00:10:26 crc kubenswrapper[4734]: I1206 00:10:26.348278 4734 scope.go:117] "RemoveContainer" containerID="874f9cc415eaa1113611521beeef0262f60be4fa9f07d3b535958ca7dbf9da35" Dec 06 00:10:26 crc kubenswrapper[4734]: I1206 00:10:26.396913 4734 scope.go:117] "RemoveContainer" containerID="2840ca6a1a6aa433993cd826f2019e6e75594a071b3b767998b5d5b36bef1289" Dec 06 00:10:26 crc kubenswrapper[4734]: E1206 00:10:26.397377 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2840ca6a1a6aa433993cd826f2019e6e75594a071b3b767998b5d5b36bef1289\": container with ID starting with 2840ca6a1a6aa433993cd826f2019e6e75594a071b3b767998b5d5b36bef1289 not found: ID does not exist" containerID="2840ca6a1a6aa433993cd826f2019e6e75594a071b3b767998b5d5b36bef1289" Dec 06 00:10:26 crc kubenswrapper[4734]: I1206 00:10:26.397414 4734 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2840ca6a1a6aa433993cd826f2019e6e75594a071b3b767998b5d5b36bef1289"} err="failed to get container status \"2840ca6a1a6aa433993cd826f2019e6e75594a071b3b767998b5d5b36bef1289\": rpc error: code = NotFound desc = could not find container \"2840ca6a1a6aa433993cd826f2019e6e75594a071b3b767998b5d5b36bef1289\": container with ID starting with 2840ca6a1a6aa433993cd826f2019e6e75594a071b3b767998b5d5b36bef1289 not found: ID does not exist" Dec 06 00:10:26 crc kubenswrapper[4734]: I1206 00:10:26.397439 4734 scope.go:117] "RemoveContainer" containerID="6a5a9de08040cbc34985bde218cc83ac7572a76b0029557619960227ff4e76af" Dec 06 00:10:26 crc kubenswrapper[4734]: E1206 00:10:26.397819 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a5a9de08040cbc34985bde218cc83ac7572a76b0029557619960227ff4e76af\": container with ID starting with 6a5a9de08040cbc34985bde218cc83ac7572a76b0029557619960227ff4e76af not found: ID does not exist" containerID="6a5a9de08040cbc34985bde218cc83ac7572a76b0029557619960227ff4e76af" Dec 06 00:10:26 crc kubenswrapper[4734]: I1206 00:10:26.397839 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a5a9de08040cbc34985bde218cc83ac7572a76b0029557619960227ff4e76af"} err="failed to get container status \"6a5a9de08040cbc34985bde218cc83ac7572a76b0029557619960227ff4e76af\": rpc error: code = NotFound desc = could not find container \"6a5a9de08040cbc34985bde218cc83ac7572a76b0029557619960227ff4e76af\": container with ID starting with 6a5a9de08040cbc34985bde218cc83ac7572a76b0029557619960227ff4e76af not found: ID does not exist" Dec 06 00:10:26 crc kubenswrapper[4734]: I1206 00:10:26.397858 4734 scope.go:117] "RemoveContainer" containerID="874f9cc415eaa1113611521beeef0262f60be4fa9f07d3b535958ca7dbf9da35" Dec 06 00:10:26 crc kubenswrapper[4734]: E1206 
00:10:26.398140 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"874f9cc415eaa1113611521beeef0262f60be4fa9f07d3b535958ca7dbf9da35\": container with ID starting with 874f9cc415eaa1113611521beeef0262f60be4fa9f07d3b535958ca7dbf9da35 not found: ID does not exist" containerID="874f9cc415eaa1113611521beeef0262f60be4fa9f07d3b535958ca7dbf9da35" Dec 06 00:10:26 crc kubenswrapper[4734]: I1206 00:10:26.398204 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"874f9cc415eaa1113611521beeef0262f60be4fa9f07d3b535958ca7dbf9da35"} err="failed to get container status \"874f9cc415eaa1113611521beeef0262f60be4fa9f07d3b535958ca7dbf9da35\": rpc error: code = NotFound desc = could not find container \"874f9cc415eaa1113611521beeef0262f60be4fa9f07d3b535958ca7dbf9da35\": container with ID starting with 874f9cc415eaa1113611521beeef0262f60be4fa9f07d3b535958ca7dbf9da35 not found: ID does not exist" Dec 06 00:10:26 crc kubenswrapper[4734]: I1206 00:10:26.614584 4734 scope.go:117] "RemoveContainer" containerID="22dd21d7db0e95a5c72f8c095ca9b5794ebebca64374ec13a730e95ad481a85b" Dec 06 00:10:26 crc kubenswrapper[4734]: E1206 00:10:26.614889 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:10:27 crc kubenswrapper[4734]: I1206 00:10:27.638587 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9afab016-6a20-4e96-b606-54dc1b9da47f" path="/var/lib/kubelet/pods/9afab016-6a20-4e96-b606-54dc1b9da47f/volumes" Dec 06 00:10:38 crc kubenswrapper[4734]: I1206 00:10:38.614208 
4734 scope.go:117] "RemoveContainer" containerID="22dd21d7db0e95a5c72f8c095ca9b5794ebebca64374ec13a730e95ad481a85b" Dec 06 00:10:38 crc kubenswrapper[4734]: E1206 00:10:38.615260 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:10:49 crc kubenswrapper[4734]: I1206 00:10:49.626057 4734 scope.go:117] "RemoveContainer" containerID="22dd21d7db0e95a5c72f8c095ca9b5794ebebca64374ec13a730e95ad481a85b" Dec 06 00:10:49 crc kubenswrapper[4734]: E1206 00:10:49.627305 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:11:04 crc kubenswrapper[4734]: I1206 00:11:04.614312 4734 scope.go:117] "RemoveContainer" containerID="22dd21d7db0e95a5c72f8c095ca9b5794ebebca64374ec13a730e95ad481a85b" Dec 06 00:11:04 crc kubenswrapper[4734]: E1206 00:11:04.615602 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:11:15 crc kubenswrapper[4734]: I1206 
00:11:15.614416 4734 scope.go:117] "RemoveContainer" containerID="22dd21d7db0e95a5c72f8c095ca9b5794ebebca64374ec13a730e95ad481a85b" Dec 06 00:11:15 crc kubenswrapper[4734]: E1206 00:11:15.615459 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:11:30 crc kubenswrapper[4734]: I1206 00:11:30.614939 4734 scope.go:117] "RemoveContainer" containerID="22dd21d7db0e95a5c72f8c095ca9b5794ebebca64374ec13a730e95ad481a85b" Dec 06 00:11:30 crc kubenswrapper[4734]: E1206 00:11:30.616069 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:11:43 crc kubenswrapper[4734]: I1206 00:11:43.614724 4734 scope.go:117] "RemoveContainer" containerID="22dd21d7db0e95a5c72f8c095ca9b5794ebebca64374ec13a730e95ad481a85b" Dec 06 00:11:43 crc kubenswrapper[4734]: E1206 00:11:43.616194 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:11:54 crc 
kubenswrapper[4734]: I1206 00:11:54.615079 4734 scope.go:117] "RemoveContainer" containerID="22dd21d7db0e95a5c72f8c095ca9b5794ebebca64374ec13a730e95ad481a85b" Dec 06 00:11:54 crc kubenswrapper[4734]: E1206 00:11:54.616230 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:12:07 crc kubenswrapper[4734]: I1206 00:12:07.615575 4734 scope.go:117] "RemoveContainer" containerID="22dd21d7db0e95a5c72f8c095ca9b5794ebebca64374ec13a730e95ad481a85b" Dec 06 00:12:07 crc kubenswrapper[4734]: E1206 00:12:07.617275 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:12:18 crc kubenswrapper[4734]: I1206 00:12:18.614311 4734 scope.go:117] "RemoveContainer" containerID="22dd21d7db0e95a5c72f8c095ca9b5794ebebca64374ec13a730e95ad481a85b" Dec 06 00:12:18 crc kubenswrapper[4734]: E1206 00:12:18.615399 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 
06 00:12:31 crc kubenswrapper[4734]: I1206 00:12:31.615293 4734 scope.go:117] "RemoveContainer" containerID="22dd21d7db0e95a5c72f8c095ca9b5794ebebca64374ec13a730e95ad481a85b" Dec 06 00:12:31 crc kubenswrapper[4734]: E1206 00:12:31.616387 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:12:45 crc kubenswrapper[4734]: I1206 00:12:45.614801 4734 scope.go:117] "RemoveContainer" containerID="22dd21d7db0e95a5c72f8c095ca9b5794ebebca64374ec13a730e95ad481a85b" Dec 06 00:12:45 crc kubenswrapper[4734]: E1206 00:12:45.616044 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:12:59 crc kubenswrapper[4734]: I1206 00:12:59.622448 4734 scope.go:117] "RemoveContainer" containerID="22dd21d7db0e95a5c72f8c095ca9b5794ebebca64374ec13a730e95ad481a85b" Dec 06 00:12:59 crc kubenswrapper[4734]: E1206 00:12:59.623409 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" 
podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:13:12 crc kubenswrapper[4734]: I1206 00:13:12.614129 4734 scope.go:117] "RemoveContainer" containerID="22dd21d7db0e95a5c72f8c095ca9b5794ebebca64374ec13a730e95ad481a85b" Dec 06 00:13:12 crc kubenswrapper[4734]: E1206 00:13:12.615053 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:13:27 crc kubenswrapper[4734]: I1206 00:13:27.614731 4734 scope.go:117] "RemoveContainer" containerID="22dd21d7db0e95a5c72f8c095ca9b5794ebebca64374ec13a730e95ad481a85b" Dec 06 00:13:27 crc kubenswrapper[4734]: E1206 00:13:27.615934 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:13:41 crc kubenswrapper[4734]: I1206 00:13:41.614401 4734 scope.go:117] "RemoveContainer" containerID="22dd21d7db0e95a5c72f8c095ca9b5794ebebca64374ec13a730e95ad481a85b" Dec 06 00:13:41 crc kubenswrapper[4734]: E1206 00:13:41.615572 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:13:53 crc kubenswrapper[4734]: I1206 00:13:53.614750 4734 scope.go:117] "RemoveContainer" containerID="22dd21d7db0e95a5c72f8c095ca9b5794ebebca64374ec13a730e95ad481a85b" Dec 06 00:13:53 crc kubenswrapper[4734]: E1206 00:13:53.616042 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:14:06 crc kubenswrapper[4734]: I1206 00:14:06.614433 4734 scope.go:117] "RemoveContainer" containerID="22dd21d7db0e95a5c72f8c095ca9b5794ebebca64374ec13a730e95ad481a85b" Dec 06 00:14:06 crc kubenswrapper[4734]: E1206 00:14:06.615577 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:14:19 crc kubenswrapper[4734]: I1206 00:14:19.615352 4734 scope.go:117] "RemoveContainer" containerID="22dd21d7db0e95a5c72f8c095ca9b5794ebebca64374ec13a730e95ad481a85b" Dec 06 00:14:19 crc kubenswrapper[4734]: E1206 00:14:19.616427 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:14:30 crc kubenswrapper[4734]: I1206 00:14:30.614804 4734 scope.go:117] "RemoveContainer" containerID="22dd21d7db0e95a5c72f8c095ca9b5794ebebca64374ec13a730e95ad481a85b" Dec 06 00:14:30 crc kubenswrapper[4734]: I1206 00:14:30.894023 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerStarted","Data":"445f686f3fcd87f38025591fb4ba15fb2afa8d39ade3e8d4bf67b8c30d64687b"} Dec 06 00:14:46 crc kubenswrapper[4734]: I1206 00:14:46.424723 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wkhvb"] Dec 06 00:14:46 crc kubenswrapper[4734]: E1206 00:14:46.427298 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9afab016-6a20-4e96-b606-54dc1b9da47f" containerName="registry-server" Dec 06 00:14:46 crc kubenswrapper[4734]: I1206 00:14:46.427325 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="9afab016-6a20-4e96-b606-54dc1b9da47f" containerName="registry-server" Dec 06 00:14:46 crc kubenswrapper[4734]: E1206 00:14:46.427340 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9afab016-6a20-4e96-b606-54dc1b9da47f" containerName="extract-content" Dec 06 00:14:46 crc kubenswrapper[4734]: I1206 00:14:46.427348 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="9afab016-6a20-4e96-b606-54dc1b9da47f" containerName="extract-content" Dec 06 00:14:46 crc kubenswrapper[4734]: E1206 00:14:46.427367 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9afab016-6a20-4e96-b606-54dc1b9da47f" containerName="extract-utilities" Dec 06 00:14:46 crc kubenswrapper[4734]: I1206 00:14:46.427376 4734 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="9afab016-6a20-4e96-b606-54dc1b9da47f" containerName="extract-utilities" Dec 06 00:14:46 crc kubenswrapper[4734]: I1206 00:14:46.427794 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="9afab016-6a20-4e96-b606-54dc1b9da47f" containerName="registry-server" Dec 06 00:14:46 crc kubenswrapper[4734]: I1206 00:14:46.429692 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wkhvb" Dec 06 00:14:46 crc kubenswrapper[4734]: I1206 00:14:46.445883 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81575bcb-11f7-409d-a6ce-ebf3fb99cd04-catalog-content\") pod \"certified-operators-wkhvb\" (UID: \"81575bcb-11f7-409d-a6ce-ebf3fb99cd04\") " pod="openshift-marketplace/certified-operators-wkhvb" Dec 06 00:14:46 crc kubenswrapper[4734]: I1206 00:14:46.445973 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgr8b\" (UniqueName: \"kubernetes.io/projected/81575bcb-11f7-409d-a6ce-ebf3fb99cd04-kube-api-access-bgr8b\") pod \"certified-operators-wkhvb\" (UID: \"81575bcb-11f7-409d-a6ce-ebf3fb99cd04\") " pod="openshift-marketplace/certified-operators-wkhvb" Dec 06 00:14:46 crc kubenswrapper[4734]: I1206 00:14:46.446039 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81575bcb-11f7-409d-a6ce-ebf3fb99cd04-utilities\") pod \"certified-operators-wkhvb\" (UID: \"81575bcb-11f7-409d-a6ce-ebf3fb99cd04\") " pod="openshift-marketplace/certified-operators-wkhvb" Dec 06 00:14:46 crc kubenswrapper[4734]: I1206 00:14:46.447158 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wkhvb"] Dec 06 00:14:46 crc kubenswrapper[4734]: I1206 00:14:46.547058 4734 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81575bcb-11f7-409d-a6ce-ebf3fb99cd04-catalog-content\") pod \"certified-operators-wkhvb\" (UID: \"81575bcb-11f7-409d-a6ce-ebf3fb99cd04\") " pod="openshift-marketplace/certified-operators-wkhvb" Dec 06 00:14:46 crc kubenswrapper[4734]: I1206 00:14:46.547113 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgr8b\" (UniqueName: \"kubernetes.io/projected/81575bcb-11f7-409d-a6ce-ebf3fb99cd04-kube-api-access-bgr8b\") pod \"certified-operators-wkhvb\" (UID: \"81575bcb-11f7-409d-a6ce-ebf3fb99cd04\") " pod="openshift-marketplace/certified-operators-wkhvb" Dec 06 00:14:46 crc kubenswrapper[4734]: I1206 00:14:46.547153 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81575bcb-11f7-409d-a6ce-ebf3fb99cd04-utilities\") pod \"certified-operators-wkhvb\" (UID: \"81575bcb-11f7-409d-a6ce-ebf3fb99cd04\") " pod="openshift-marketplace/certified-operators-wkhvb" Dec 06 00:14:46 crc kubenswrapper[4734]: I1206 00:14:46.547759 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81575bcb-11f7-409d-a6ce-ebf3fb99cd04-catalog-content\") pod \"certified-operators-wkhvb\" (UID: \"81575bcb-11f7-409d-a6ce-ebf3fb99cd04\") " pod="openshift-marketplace/certified-operators-wkhvb" Dec 06 00:14:46 crc kubenswrapper[4734]: I1206 00:14:46.547819 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81575bcb-11f7-409d-a6ce-ebf3fb99cd04-utilities\") pod \"certified-operators-wkhvb\" (UID: \"81575bcb-11f7-409d-a6ce-ebf3fb99cd04\") " pod="openshift-marketplace/certified-operators-wkhvb" Dec 06 00:14:46 crc kubenswrapper[4734]: I1206 00:14:46.579497 4734 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bgr8b\" (UniqueName: \"kubernetes.io/projected/81575bcb-11f7-409d-a6ce-ebf3fb99cd04-kube-api-access-bgr8b\") pod \"certified-operators-wkhvb\" (UID: \"81575bcb-11f7-409d-a6ce-ebf3fb99cd04\") " pod="openshift-marketplace/certified-operators-wkhvb" Dec 06 00:14:46 crc kubenswrapper[4734]: I1206 00:14:46.803144 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wkhvb" Dec 06 00:14:47 crc kubenswrapper[4734]: I1206 00:14:47.785881 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wkhvb"] Dec 06 00:14:48 crc kubenswrapper[4734]: I1206 00:14:48.086569 4734 generic.go:334] "Generic (PLEG): container finished" podID="81575bcb-11f7-409d-a6ce-ebf3fb99cd04" containerID="d435305aac6ca1b5fc7bc087f0aeccc5a3b2b670ba536044f64005e9cd88a888" exitCode=0 Dec 06 00:14:48 crc kubenswrapper[4734]: I1206 00:14:48.086642 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wkhvb" event={"ID":"81575bcb-11f7-409d-a6ce-ebf3fb99cd04","Type":"ContainerDied","Data":"d435305aac6ca1b5fc7bc087f0aeccc5a3b2b670ba536044f64005e9cd88a888"} Dec 06 00:14:48 crc kubenswrapper[4734]: I1206 00:14:48.086702 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wkhvb" event={"ID":"81575bcb-11f7-409d-a6ce-ebf3fb99cd04","Type":"ContainerStarted","Data":"bb6ed15fc98d0a44b4c93fa06ef358602fefee888a4be2f477eeae35ddc63188"} Dec 06 00:14:49 crc kubenswrapper[4734]: I1206 00:14:49.109204 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wkhvb" event={"ID":"81575bcb-11f7-409d-a6ce-ebf3fb99cd04","Type":"ContainerStarted","Data":"99d78de747c5170145db77f64bd027835047642afd2a9510e3a2c8aa422709bc"} Dec 06 00:14:50 crc kubenswrapper[4734]: I1206 00:14:50.123682 4734 generic.go:334] "Generic (PLEG): 
container finished" podID="81575bcb-11f7-409d-a6ce-ebf3fb99cd04" containerID="99d78de747c5170145db77f64bd027835047642afd2a9510e3a2c8aa422709bc" exitCode=0 Dec 06 00:14:50 crc kubenswrapper[4734]: I1206 00:14:50.123791 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wkhvb" event={"ID":"81575bcb-11f7-409d-a6ce-ebf3fb99cd04","Type":"ContainerDied","Data":"99d78de747c5170145db77f64bd027835047642afd2a9510e3a2c8aa422709bc"} Dec 06 00:14:51 crc kubenswrapper[4734]: I1206 00:14:51.138059 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wkhvb" event={"ID":"81575bcb-11f7-409d-a6ce-ebf3fb99cd04","Type":"ContainerStarted","Data":"e1c0b261f7515fd597ff2694facbda7e9df37f2a7cdd1a2cb7e3ef9d7ceaa70e"} Dec 06 00:14:56 crc kubenswrapper[4734]: I1206 00:14:56.804650 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wkhvb" Dec 06 00:14:56 crc kubenswrapper[4734]: I1206 00:14:56.805530 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wkhvb" Dec 06 00:14:56 crc kubenswrapper[4734]: I1206 00:14:56.856553 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wkhvb" Dec 06 00:14:56 crc kubenswrapper[4734]: I1206 00:14:56.876636 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wkhvb" podStartSLOduration=8.446873325 podStartE2EDuration="10.876612626s" podCreationTimestamp="2025-12-06 00:14:46 +0000 UTC" firstStartedPulling="2025-12-06 00:14:48.089211947 +0000 UTC m=+3308.772616223" lastFinishedPulling="2025-12-06 00:14:50.518951248 +0000 UTC m=+3311.202355524" observedRunningTime="2025-12-06 00:14:51.164286578 +0000 UTC m=+3311.847690854" watchObservedRunningTime="2025-12-06 00:14:56.876612626 +0000 UTC 
m=+3317.560016902" Dec 06 00:14:57 crc kubenswrapper[4734]: I1206 00:14:57.263932 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wkhvb" Dec 06 00:14:57 crc kubenswrapper[4734]: I1206 00:14:57.323053 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wkhvb"] Dec 06 00:14:59 crc kubenswrapper[4734]: I1206 00:14:59.221295 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wkhvb" podUID="81575bcb-11f7-409d-a6ce-ebf3fb99cd04" containerName="registry-server" containerID="cri-o://e1c0b261f7515fd597ff2694facbda7e9df37f2a7cdd1a2cb7e3ef9d7ceaa70e" gracePeriod=2 Dec 06 00:14:59 crc kubenswrapper[4734]: I1206 00:14:59.857177 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wkhvb" Dec 06 00:14:59 crc kubenswrapper[4734]: I1206 00:14:59.970558 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81575bcb-11f7-409d-a6ce-ebf3fb99cd04-catalog-content\") pod \"81575bcb-11f7-409d-a6ce-ebf3fb99cd04\" (UID: \"81575bcb-11f7-409d-a6ce-ebf3fb99cd04\") " Dec 06 00:14:59 crc kubenswrapper[4734]: I1206 00:14:59.970615 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81575bcb-11f7-409d-a6ce-ebf3fb99cd04-utilities\") pod \"81575bcb-11f7-409d-a6ce-ebf3fb99cd04\" (UID: \"81575bcb-11f7-409d-a6ce-ebf3fb99cd04\") " Dec 06 00:14:59 crc kubenswrapper[4734]: I1206 00:14:59.970636 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgr8b\" (UniqueName: \"kubernetes.io/projected/81575bcb-11f7-409d-a6ce-ebf3fb99cd04-kube-api-access-bgr8b\") pod \"81575bcb-11f7-409d-a6ce-ebf3fb99cd04\" (UID: 
\"81575bcb-11f7-409d-a6ce-ebf3fb99cd04\") " Dec 06 00:14:59 crc kubenswrapper[4734]: I1206 00:14:59.971673 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81575bcb-11f7-409d-a6ce-ebf3fb99cd04-utilities" (OuterVolumeSpecName: "utilities") pod "81575bcb-11f7-409d-a6ce-ebf3fb99cd04" (UID: "81575bcb-11f7-409d-a6ce-ebf3fb99cd04"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 00:14:59 crc kubenswrapper[4734]: I1206 00:14:59.979836 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81575bcb-11f7-409d-a6ce-ebf3fb99cd04-kube-api-access-bgr8b" (OuterVolumeSpecName: "kube-api-access-bgr8b") pod "81575bcb-11f7-409d-a6ce-ebf3fb99cd04" (UID: "81575bcb-11f7-409d-a6ce-ebf3fb99cd04"). InnerVolumeSpecName "kube-api-access-bgr8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.033936 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81575bcb-11f7-409d-a6ce-ebf3fb99cd04-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81575bcb-11f7-409d-a6ce-ebf3fb99cd04" (UID: "81575bcb-11f7-409d-a6ce-ebf3fb99cd04"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.072976 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81575bcb-11f7-409d-a6ce-ebf3fb99cd04-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.073029 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81575bcb-11f7-409d-a6ce-ebf3fb99cd04-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.073047 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgr8b\" (UniqueName: \"kubernetes.io/projected/81575bcb-11f7-409d-a6ce-ebf3fb99cd04-kube-api-access-bgr8b\") on node \"crc\" DevicePath \"\"" Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.161651 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416335-jcz65"] Dec 06 00:15:00 crc kubenswrapper[4734]: E1206 00:15:00.162877 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81575bcb-11f7-409d-a6ce-ebf3fb99cd04" containerName="registry-server" Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.162970 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="81575bcb-11f7-409d-a6ce-ebf3fb99cd04" containerName="registry-server" Dec 06 00:15:00 crc kubenswrapper[4734]: E1206 00:15:00.163051 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81575bcb-11f7-409d-a6ce-ebf3fb99cd04" containerName="extract-utilities" Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.163124 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="81575bcb-11f7-409d-a6ce-ebf3fb99cd04" containerName="extract-utilities" Dec 06 00:15:00 crc kubenswrapper[4734]: E1206 00:15:00.163200 4734 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="81575bcb-11f7-409d-a6ce-ebf3fb99cd04" containerName="extract-content" Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.163258 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="81575bcb-11f7-409d-a6ce-ebf3fb99cd04" containerName="extract-content" Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.163603 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="81575bcb-11f7-409d-a6ce-ebf3fb99cd04" containerName="registry-server" Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.164648 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416335-jcz65" Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.167902 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.168566 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.179776 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416335-jcz65"] Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.249950 4734 generic.go:334] "Generic (PLEG): container finished" podID="81575bcb-11f7-409d-a6ce-ebf3fb99cd04" containerID="e1c0b261f7515fd597ff2694facbda7e9df37f2a7cdd1a2cb7e3ef9d7ceaa70e" exitCode=0 Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.250017 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wkhvb" event={"ID":"81575bcb-11f7-409d-a6ce-ebf3fb99cd04","Type":"ContainerDied","Data":"e1c0b261f7515fd597ff2694facbda7e9df37f2a7cdd1a2cb7e3ef9d7ceaa70e"} Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.250057 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-wkhvb" event={"ID":"81575bcb-11f7-409d-a6ce-ebf3fb99cd04","Type":"ContainerDied","Data":"bb6ed15fc98d0a44b4c93fa06ef358602fefee888a4be2f477eeae35ddc63188"} Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.250062 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wkhvb" Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.250081 4734 scope.go:117] "RemoveContainer" containerID="e1c0b261f7515fd597ff2694facbda7e9df37f2a7cdd1a2cb7e3ef9d7ceaa70e" Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.277435 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/df5cb6b9-c714-4b5d-94e2-eb316b0daa4b-secret-volume\") pod \"collect-profiles-29416335-jcz65\" (UID: \"df5cb6b9-c714-4b5d-94e2-eb316b0daa4b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416335-jcz65" Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.277554 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df5cb6b9-c714-4b5d-94e2-eb316b0daa4b-config-volume\") pod \"collect-profiles-29416335-jcz65\" (UID: \"df5cb6b9-c714-4b5d-94e2-eb316b0daa4b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416335-jcz65" Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.277764 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnp2h\" (UniqueName: \"kubernetes.io/projected/df5cb6b9-c714-4b5d-94e2-eb316b0daa4b-kube-api-access-cnp2h\") pod \"collect-profiles-29416335-jcz65\" (UID: \"df5cb6b9-c714-4b5d-94e2-eb316b0daa4b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416335-jcz65" Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.292110 4734 scope.go:117] 
"RemoveContainer" containerID="99d78de747c5170145db77f64bd027835047642afd2a9510e3a2c8aa422709bc" Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.292633 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wkhvb"] Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.305873 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wkhvb"] Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.320620 4734 scope.go:117] "RemoveContainer" containerID="d435305aac6ca1b5fc7bc087f0aeccc5a3b2b670ba536044f64005e9cd88a888" Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.372726 4734 scope.go:117] "RemoveContainer" containerID="e1c0b261f7515fd597ff2694facbda7e9df37f2a7cdd1a2cb7e3ef9d7ceaa70e" Dec 06 00:15:00 crc kubenswrapper[4734]: E1206 00:15:00.373426 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1c0b261f7515fd597ff2694facbda7e9df37f2a7cdd1a2cb7e3ef9d7ceaa70e\": container with ID starting with e1c0b261f7515fd597ff2694facbda7e9df37f2a7cdd1a2cb7e3ef9d7ceaa70e not found: ID does not exist" containerID="e1c0b261f7515fd597ff2694facbda7e9df37f2a7cdd1a2cb7e3ef9d7ceaa70e" Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.373487 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1c0b261f7515fd597ff2694facbda7e9df37f2a7cdd1a2cb7e3ef9d7ceaa70e"} err="failed to get container status \"e1c0b261f7515fd597ff2694facbda7e9df37f2a7cdd1a2cb7e3ef9d7ceaa70e\": rpc error: code = NotFound desc = could not find container \"e1c0b261f7515fd597ff2694facbda7e9df37f2a7cdd1a2cb7e3ef9d7ceaa70e\": container with ID starting with e1c0b261f7515fd597ff2694facbda7e9df37f2a7cdd1a2cb7e3ef9d7ceaa70e not found: ID does not exist" Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.373691 4734 scope.go:117] "RemoveContainer" 
containerID="99d78de747c5170145db77f64bd027835047642afd2a9510e3a2c8aa422709bc" Dec 06 00:15:00 crc kubenswrapper[4734]: E1206 00:15:00.374637 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99d78de747c5170145db77f64bd027835047642afd2a9510e3a2c8aa422709bc\": container with ID starting with 99d78de747c5170145db77f64bd027835047642afd2a9510e3a2c8aa422709bc not found: ID does not exist" containerID="99d78de747c5170145db77f64bd027835047642afd2a9510e3a2c8aa422709bc" Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.374740 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99d78de747c5170145db77f64bd027835047642afd2a9510e3a2c8aa422709bc"} err="failed to get container status \"99d78de747c5170145db77f64bd027835047642afd2a9510e3a2c8aa422709bc\": rpc error: code = NotFound desc = could not find container \"99d78de747c5170145db77f64bd027835047642afd2a9510e3a2c8aa422709bc\": container with ID starting with 99d78de747c5170145db77f64bd027835047642afd2a9510e3a2c8aa422709bc not found: ID does not exist" Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.374791 4734 scope.go:117] "RemoveContainer" containerID="d435305aac6ca1b5fc7bc087f0aeccc5a3b2b670ba536044f64005e9cd88a888" Dec 06 00:15:00 crc kubenswrapper[4734]: E1206 00:15:00.375468 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d435305aac6ca1b5fc7bc087f0aeccc5a3b2b670ba536044f64005e9cd88a888\": container with ID starting with d435305aac6ca1b5fc7bc087f0aeccc5a3b2b670ba536044f64005e9cd88a888 not found: ID does not exist" containerID="d435305aac6ca1b5fc7bc087f0aeccc5a3b2b670ba536044f64005e9cd88a888" Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.375500 4734 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d435305aac6ca1b5fc7bc087f0aeccc5a3b2b670ba536044f64005e9cd88a888"} err="failed to get container status \"d435305aac6ca1b5fc7bc087f0aeccc5a3b2b670ba536044f64005e9cd88a888\": rpc error: code = NotFound desc = could not find container \"d435305aac6ca1b5fc7bc087f0aeccc5a3b2b670ba536044f64005e9cd88a888\": container with ID starting with d435305aac6ca1b5fc7bc087f0aeccc5a3b2b670ba536044f64005e9cd88a888 not found: ID does not exist" Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.381508 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/df5cb6b9-c714-4b5d-94e2-eb316b0daa4b-secret-volume\") pod \"collect-profiles-29416335-jcz65\" (UID: \"df5cb6b9-c714-4b5d-94e2-eb316b0daa4b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416335-jcz65" Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.381593 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df5cb6b9-c714-4b5d-94e2-eb316b0daa4b-config-volume\") pod \"collect-profiles-29416335-jcz65\" (UID: \"df5cb6b9-c714-4b5d-94e2-eb316b0daa4b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416335-jcz65" Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.381680 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnp2h\" (UniqueName: \"kubernetes.io/projected/df5cb6b9-c714-4b5d-94e2-eb316b0daa4b-kube-api-access-cnp2h\") pod \"collect-profiles-29416335-jcz65\" (UID: \"df5cb6b9-c714-4b5d-94e2-eb316b0daa4b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416335-jcz65" Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.382592 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df5cb6b9-c714-4b5d-94e2-eb316b0daa4b-config-volume\") pod 
\"collect-profiles-29416335-jcz65\" (UID: \"df5cb6b9-c714-4b5d-94e2-eb316b0daa4b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416335-jcz65" Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.400246 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/df5cb6b9-c714-4b5d-94e2-eb316b0daa4b-secret-volume\") pod \"collect-profiles-29416335-jcz65\" (UID: \"df5cb6b9-c714-4b5d-94e2-eb316b0daa4b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416335-jcz65" Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.403872 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnp2h\" (UniqueName: \"kubernetes.io/projected/df5cb6b9-c714-4b5d-94e2-eb316b0daa4b-kube-api-access-cnp2h\") pod \"collect-profiles-29416335-jcz65\" (UID: \"df5cb6b9-c714-4b5d-94e2-eb316b0daa4b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416335-jcz65" Dec 06 00:15:00 crc kubenswrapper[4734]: I1206 00:15:00.514322 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416335-jcz65" Dec 06 00:15:01 crc kubenswrapper[4734]: I1206 00:15:01.045440 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416335-jcz65"] Dec 06 00:15:01 crc kubenswrapper[4734]: I1206 00:15:01.267487 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416335-jcz65" event={"ID":"df5cb6b9-c714-4b5d-94e2-eb316b0daa4b","Type":"ContainerStarted","Data":"0d353a511604d09ec5995e66f484d088764a4f49b49b56aeb5d9a1300e229b1e"} Dec 06 00:15:01 crc kubenswrapper[4734]: I1206 00:15:01.267963 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416335-jcz65" event={"ID":"df5cb6b9-c714-4b5d-94e2-eb316b0daa4b","Type":"ContainerStarted","Data":"1a6b94a2901ce36850df8a5d2e1d1b79615701e7a3cec89bd163da393be63495"} Dec 06 00:15:01 crc kubenswrapper[4734]: I1206 00:15:01.306787 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29416335-jcz65" podStartSLOduration=1.306740965 podStartE2EDuration="1.306740965s" podCreationTimestamp="2025-12-06 00:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 00:15:01.284423009 +0000 UTC m=+3321.967827305" watchObservedRunningTime="2025-12-06 00:15:01.306740965 +0000 UTC m=+3321.990145241" Dec 06 00:15:01 crc kubenswrapper[4734]: I1206 00:15:01.627693 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81575bcb-11f7-409d-a6ce-ebf3fb99cd04" path="/var/lib/kubelet/pods/81575bcb-11f7-409d-a6ce-ebf3fb99cd04/volumes" Dec 06 00:15:02 crc kubenswrapper[4734]: I1206 00:15:02.282239 4734 generic.go:334] "Generic (PLEG): container finished" podID="df5cb6b9-c714-4b5d-94e2-eb316b0daa4b" 
containerID="0d353a511604d09ec5995e66f484d088764a4f49b49b56aeb5d9a1300e229b1e" exitCode=0 Dec 06 00:15:02 crc kubenswrapper[4734]: I1206 00:15:02.282380 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416335-jcz65" event={"ID":"df5cb6b9-c714-4b5d-94e2-eb316b0daa4b","Type":"ContainerDied","Data":"0d353a511604d09ec5995e66f484d088764a4f49b49b56aeb5d9a1300e229b1e"} Dec 06 00:15:03 crc kubenswrapper[4734]: I1206 00:15:03.785259 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416335-jcz65" Dec 06 00:15:03 crc kubenswrapper[4734]: I1206 00:15:03.970071 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/df5cb6b9-c714-4b5d-94e2-eb316b0daa4b-secret-volume\") pod \"df5cb6b9-c714-4b5d-94e2-eb316b0daa4b\" (UID: \"df5cb6b9-c714-4b5d-94e2-eb316b0daa4b\") " Dec 06 00:15:03 crc kubenswrapper[4734]: I1206 00:15:03.970734 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df5cb6b9-c714-4b5d-94e2-eb316b0daa4b-config-volume\") pod \"df5cb6b9-c714-4b5d-94e2-eb316b0daa4b\" (UID: \"df5cb6b9-c714-4b5d-94e2-eb316b0daa4b\") " Dec 06 00:15:03 crc kubenswrapper[4734]: I1206 00:15:03.971133 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnp2h\" (UniqueName: \"kubernetes.io/projected/df5cb6b9-c714-4b5d-94e2-eb316b0daa4b-kube-api-access-cnp2h\") pod \"df5cb6b9-c714-4b5d-94e2-eb316b0daa4b\" (UID: \"df5cb6b9-c714-4b5d-94e2-eb316b0daa4b\") " Dec 06 00:15:03 crc kubenswrapper[4734]: I1206 00:15:03.972096 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df5cb6b9-c714-4b5d-94e2-eb316b0daa4b-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"df5cb6b9-c714-4b5d-94e2-eb316b0daa4b" (UID: "df5cb6b9-c714-4b5d-94e2-eb316b0daa4b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 00:15:03 crc kubenswrapper[4734]: I1206 00:15:03.972264 4734 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df5cb6b9-c714-4b5d-94e2-eb316b0daa4b-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 00:15:03 crc kubenswrapper[4734]: I1206 00:15:03.980141 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df5cb6b9-c714-4b5d-94e2-eb316b0daa4b-kube-api-access-cnp2h" (OuterVolumeSpecName: "kube-api-access-cnp2h") pod "df5cb6b9-c714-4b5d-94e2-eb316b0daa4b" (UID: "df5cb6b9-c714-4b5d-94e2-eb316b0daa4b"). InnerVolumeSpecName "kube-api-access-cnp2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:15:03 crc kubenswrapper[4734]: I1206 00:15:03.983370 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df5cb6b9-c714-4b5d-94e2-eb316b0daa4b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "df5cb6b9-c714-4b5d-94e2-eb316b0daa4b" (UID: "df5cb6b9-c714-4b5d-94e2-eb316b0daa4b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:15:04 crc kubenswrapper[4734]: I1206 00:15:04.074159 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnp2h\" (UniqueName: \"kubernetes.io/projected/df5cb6b9-c714-4b5d-94e2-eb316b0daa4b-kube-api-access-cnp2h\") on node \"crc\" DevicePath \"\"" Dec 06 00:15:04 crc kubenswrapper[4734]: I1206 00:15:04.074207 4734 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/df5cb6b9-c714-4b5d-94e2-eb316b0daa4b-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 00:15:04 crc kubenswrapper[4734]: I1206 00:15:04.308012 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416335-jcz65" event={"ID":"df5cb6b9-c714-4b5d-94e2-eb316b0daa4b","Type":"ContainerDied","Data":"1a6b94a2901ce36850df8a5d2e1d1b79615701e7a3cec89bd163da393be63495"} Dec 06 00:15:04 crc kubenswrapper[4734]: I1206 00:15:04.308071 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a6b94a2901ce36850df8a5d2e1d1b79615701e7a3cec89bd163da393be63495" Dec 06 00:15:04 crc kubenswrapper[4734]: I1206 00:15:04.308151 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416335-jcz65" Dec 06 00:15:04 crc kubenswrapper[4734]: I1206 00:15:04.383776 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416290-pwzdw"] Dec 06 00:15:04 crc kubenswrapper[4734]: I1206 00:15:04.396330 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416290-pwzdw"] Dec 06 00:15:05 crc kubenswrapper[4734]: I1206 00:15:05.626956 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="267d14c5-5f5d-424b-8e0c-a7f1bc88892a" path="/var/lib/kubelet/pods/267d14c5-5f5d-424b-8e0c-a7f1bc88892a/volumes" Dec 06 00:15:52 crc kubenswrapper[4734]: I1206 00:15:52.824204 4734 scope.go:117] "RemoveContainer" containerID="22ab8a3516c0f76f56c2665776e8062aac5ee91ace07ec854e33671f4354f137" Dec 06 00:16:06 crc kubenswrapper[4734]: I1206 00:16:06.890824 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k5c7z"] Dec 06 00:16:06 crc kubenswrapper[4734]: E1206 00:16:06.896220 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df5cb6b9-c714-4b5d-94e2-eb316b0daa4b" containerName="collect-profiles" Dec 06 00:16:06 crc kubenswrapper[4734]: I1206 00:16:06.896245 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="df5cb6b9-c714-4b5d-94e2-eb316b0daa4b" containerName="collect-profiles" Dec 06 00:16:06 crc kubenswrapper[4734]: I1206 00:16:06.896472 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="df5cb6b9-c714-4b5d-94e2-eb316b0daa4b" containerName="collect-profiles" Dec 06 00:16:06 crc kubenswrapper[4734]: I1206 00:16:06.898350 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k5c7z" Dec 06 00:16:06 crc kubenswrapper[4734]: I1206 00:16:06.910425 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k5c7z"] Dec 06 00:16:06 crc kubenswrapper[4734]: I1206 00:16:06.979833 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ab978d4-d6b9-4e21-91a2-2794009d4483-utilities\") pod \"redhat-operators-k5c7z\" (UID: \"4ab978d4-d6b9-4e21-91a2-2794009d4483\") " pod="openshift-marketplace/redhat-operators-k5c7z" Dec 06 00:16:06 crc kubenswrapper[4734]: I1206 00:16:06.979929 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b7jl\" (UniqueName: \"kubernetes.io/projected/4ab978d4-d6b9-4e21-91a2-2794009d4483-kube-api-access-6b7jl\") pod \"redhat-operators-k5c7z\" (UID: \"4ab978d4-d6b9-4e21-91a2-2794009d4483\") " pod="openshift-marketplace/redhat-operators-k5c7z" Dec 06 00:16:06 crc kubenswrapper[4734]: I1206 00:16:06.980143 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ab978d4-d6b9-4e21-91a2-2794009d4483-catalog-content\") pod \"redhat-operators-k5c7z\" (UID: \"4ab978d4-d6b9-4e21-91a2-2794009d4483\") " pod="openshift-marketplace/redhat-operators-k5c7z" Dec 06 00:16:07 crc kubenswrapper[4734]: I1206 00:16:07.082505 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ab978d4-d6b9-4e21-91a2-2794009d4483-catalog-content\") pod \"redhat-operators-k5c7z\" (UID: \"4ab978d4-d6b9-4e21-91a2-2794009d4483\") " pod="openshift-marketplace/redhat-operators-k5c7z" Dec 06 00:16:07 crc kubenswrapper[4734]: I1206 00:16:07.083142 4734 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ab978d4-d6b9-4e21-91a2-2794009d4483-utilities\") pod \"redhat-operators-k5c7z\" (UID: \"4ab978d4-d6b9-4e21-91a2-2794009d4483\") " pod="openshift-marketplace/redhat-operators-k5c7z" Dec 06 00:16:07 crc kubenswrapper[4734]: I1206 00:16:07.083187 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b7jl\" (UniqueName: \"kubernetes.io/projected/4ab978d4-d6b9-4e21-91a2-2794009d4483-kube-api-access-6b7jl\") pod \"redhat-operators-k5c7z\" (UID: \"4ab978d4-d6b9-4e21-91a2-2794009d4483\") " pod="openshift-marketplace/redhat-operators-k5c7z" Dec 06 00:16:07 crc kubenswrapper[4734]: I1206 00:16:07.083270 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ab978d4-d6b9-4e21-91a2-2794009d4483-catalog-content\") pod \"redhat-operators-k5c7z\" (UID: \"4ab978d4-d6b9-4e21-91a2-2794009d4483\") " pod="openshift-marketplace/redhat-operators-k5c7z" Dec 06 00:16:07 crc kubenswrapper[4734]: I1206 00:16:07.083746 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ab978d4-d6b9-4e21-91a2-2794009d4483-utilities\") pod \"redhat-operators-k5c7z\" (UID: \"4ab978d4-d6b9-4e21-91a2-2794009d4483\") " pod="openshift-marketplace/redhat-operators-k5c7z" Dec 06 00:16:07 crc kubenswrapper[4734]: I1206 00:16:07.108536 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b7jl\" (UniqueName: \"kubernetes.io/projected/4ab978d4-d6b9-4e21-91a2-2794009d4483-kube-api-access-6b7jl\") pod \"redhat-operators-k5c7z\" (UID: \"4ab978d4-d6b9-4e21-91a2-2794009d4483\") " pod="openshift-marketplace/redhat-operators-k5c7z" Dec 06 00:16:07 crc kubenswrapper[4734]: I1206 00:16:07.231666 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k5c7z" Dec 06 00:16:07 crc kubenswrapper[4734]: I1206 00:16:07.832844 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k5c7z"] Dec 06 00:16:08 crc kubenswrapper[4734]: I1206 00:16:08.039644 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k5c7z" event={"ID":"4ab978d4-d6b9-4e21-91a2-2794009d4483","Type":"ContainerStarted","Data":"8e0c4b14a629e7770f43f2491b572251c60a0c086766903c15f314ea3942dd68"} Dec 06 00:16:09 crc kubenswrapper[4734]: I1206 00:16:09.051034 4734 generic.go:334] "Generic (PLEG): container finished" podID="4ab978d4-d6b9-4e21-91a2-2794009d4483" containerID="90f1274762d124a81ed19b34b15814be83309b675084411d063fc68b1d51bee7" exitCode=0 Dec 06 00:16:09 crc kubenswrapper[4734]: I1206 00:16:09.051100 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k5c7z" event={"ID":"4ab978d4-d6b9-4e21-91a2-2794009d4483","Type":"ContainerDied","Data":"90f1274762d124a81ed19b34b15814be83309b675084411d063fc68b1d51bee7"} Dec 06 00:16:09 crc kubenswrapper[4734]: I1206 00:16:09.054703 4734 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 00:16:11 crc kubenswrapper[4734]: I1206 00:16:11.075297 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k5c7z" event={"ID":"4ab978d4-d6b9-4e21-91a2-2794009d4483","Type":"ContainerStarted","Data":"ddf779e9f299cf21f00a293cf173be2a275ba5f8cfa6e72d97611823d6ba6631"} Dec 06 00:16:13 crc kubenswrapper[4734]: I1206 00:16:13.097414 4734 generic.go:334] "Generic (PLEG): container finished" podID="4ab978d4-d6b9-4e21-91a2-2794009d4483" containerID="ddf779e9f299cf21f00a293cf173be2a275ba5f8cfa6e72d97611823d6ba6631" exitCode=0 Dec 06 00:16:13 crc kubenswrapper[4734]: I1206 00:16:13.097991 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-k5c7z" event={"ID":"4ab978d4-d6b9-4e21-91a2-2794009d4483","Type":"ContainerDied","Data":"ddf779e9f299cf21f00a293cf173be2a275ba5f8cfa6e72d97611823d6ba6631"} Dec 06 00:16:14 crc kubenswrapper[4734]: I1206 00:16:14.113661 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k5c7z" event={"ID":"4ab978d4-d6b9-4e21-91a2-2794009d4483","Type":"ContainerStarted","Data":"79b9dba12246119ab8ba1c35d88d21ca0e76a8d2a5dc068253d1c2f473f0d4e2"} Dec 06 00:16:14 crc kubenswrapper[4734]: I1206 00:16:14.187280 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k5c7z" podStartSLOduration=3.725606918 podStartE2EDuration="8.187257685s" podCreationTimestamp="2025-12-06 00:16:06 +0000 UTC" firstStartedPulling="2025-12-06 00:16:09.054226241 +0000 UTC m=+3389.737630517" lastFinishedPulling="2025-12-06 00:16:13.515877008 +0000 UTC m=+3394.199281284" observedRunningTime="2025-12-06 00:16:14.178704476 +0000 UTC m=+3394.862108752" watchObservedRunningTime="2025-12-06 00:16:14.187257685 +0000 UTC m=+3394.870661961" Dec 06 00:16:15 crc kubenswrapper[4734]: I1206 00:16:15.609963 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7pm85"] Dec 06 00:16:15 crc kubenswrapper[4734]: I1206 00:16:15.613068 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7pm85" Dec 06 00:16:15 crc kubenswrapper[4734]: I1206 00:16:15.635427 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7pm85"] Dec 06 00:16:15 crc kubenswrapper[4734]: I1206 00:16:15.730631 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e38ec08-407f-4a7a-af27-dbed677adfb2-catalog-content\") pod \"redhat-marketplace-7pm85\" (UID: \"6e38ec08-407f-4a7a-af27-dbed677adfb2\") " pod="openshift-marketplace/redhat-marketplace-7pm85" Dec 06 00:16:15 crc kubenswrapper[4734]: I1206 00:16:15.730788 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdmpk\" (UniqueName: \"kubernetes.io/projected/6e38ec08-407f-4a7a-af27-dbed677adfb2-kube-api-access-sdmpk\") pod \"redhat-marketplace-7pm85\" (UID: \"6e38ec08-407f-4a7a-af27-dbed677adfb2\") " pod="openshift-marketplace/redhat-marketplace-7pm85" Dec 06 00:16:15 crc kubenswrapper[4734]: I1206 00:16:15.730938 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e38ec08-407f-4a7a-af27-dbed677adfb2-utilities\") pod \"redhat-marketplace-7pm85\" (UID: \"6e38ec08-407f-4a7a-af27-dbed677adfb2\") " pod="openshift-marketplace/redhat-marketplace-7pm85" Dec 06 00:16:15 crc kubenswrapper[4734]: I1206 00:16:15.832825 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdmpk\" (UniqueName: \"kubernetes.io/projected/6e38ec08-407f-4a7a-af27-dbed677adfb2-kube-api-access-sdmpk\") pod \"redhat-marketplace-7pm85\" (UID: \"6e38ec08-407f-4a7a-af27-dbed677adfb2\") " pod="openshift-marketplace/redhat-marketplace-7pm85" Dec 06 00:16:15 crc kubenswrapper[4734]: I1206 00:16:15.833343 4734 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e38ec08-407f-4a7a-af27-dbed677adfb2-utilities\") pod \"redhat-marketplace-7pm85\" (UID: \"6e38ec08-407f-4a7a-af27-dbed677adfb2\") " pod="openshift-marketplace/redhat-marketplace-7pm85" Dec 06 00:16:15 crc kubenswrapper[4734]: I1206 00:16:15.833551 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e38ec08-407f-4a7a-af27-dbed677adfb2-catalog-content\") pod \"redhat-marketplace-7pm85\" (UID: \"6e38ec08-407f-4a7a-af27-dbed677adfb2\") " pod="openshift-marketplace/redhat-marketplace-7pm85" Dec 06 00:16:15 crc kubenswrapper[4734]: I1206 00:16:15.834102 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e38ec08-407f-4a7a-af27-dbed677adfb2-utilities\") pod \"redhat-marketplace-7pm85\" (UID: \"6e38ec08-407f-4a7a-af27-dbed677adfb2\") " pod="openshift-marketplace/redhat-marketplace-7pm85" Dec 06 00:16:15 crc kubenswrapper[4734]: I1206 00:16:15.834173 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e38ec08-407f-4a7a-af27-dbed677adfb2-catalog-content\") pod \"redhat-marketplace-7pm85\" (UID: \"6e38ec08-407f-4a7a-af27-dbed677adfb2\") " pod="openshift-marketplace/redhat-marketplace-7pm85" Dec 06 00:16:15 crc kubenswrapper[4734]: I1206 00:16:15.857948 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdmpk\" (UniqueName: \"kubernetes.io/projected/6e38ec08-407f-4a7a-af27-dbed677adfb2-kube-api-access-sdmpk\") pod \"redhat-marketplace-7pm85\" (UID: \"6e38ec08-407f-4a7a-af27-dbed677adfb2\") " pod="openshift-marketplace/redhat-marketplace-7pm85" Dec 06 00:16:15 crc kubenswrapper[4734]: I1206 00:16:15.939664 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7pm85" Dec 06 00:16:16 crc kubenswrapper[4734]: W1206 00:16:16.500248 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e38ec08_407f_4a7a_af27_dbed677adfb2.slice/crio-c901c38f8428ad0d369dea2663671c90a280094a4e0ec59a30424f06868610b9 WatchSource:0}: Error finding container c901c38f8428ad0d369dea2663671c90a280094a4e0ec59a30424f06868610b9: Status 404 returned error can't find the container with id c901c38f8428ad0d369dea2663671c90a280094a4e0ec59a30424f06868610b9 Dec 06 00:16:16 crc kubenswrapper[4734]: I1206 00:16:16.507462 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7pm85"] Dec 06 00:16:17 crc kubenswrapper[4734]: I1206 00:16:17.194039 4734 generic.go:334] "Generic (PLEG): container finished" podID="6e38ec08-407f-4a7a-af27-dbed677adfb2" containerID="984992e21d03f9fc583c12c877f6a109d1be32797082dd38459fd54b25acfae4" exitCode=0 Dec 06 00:16:17 crc kubenswrapper[4734]: I1206 00:16:17.194217 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pm85" event={"ID":"6e38ec08-407f-4a7a-af27-dbed677adfb2","Type":"ContainerDied","Data":"984992e21d03f9fc583c12c877f6a109d1be32797082dd38459fd54b25acfae4"} Dec 06 00:16:17 crc kubenswrapper[4734]: I1206 00:16:17.198471 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pm85" event={"ID":"6e38ec08-407f-4a7a-af27-dbed677adfb2","Type":"ContainerStarted","Data":"c901c38f8428ad0d369dea2663671c90a280094a4e0ec59a30424f06868610b9"} Dec 06 00:16:17 crc kubenswrapper[4734]: I1206 00:16:17.231792 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k5c7z" Dec 06 00:16:17 crc kubenswrapper[4734]: I1206 00:16:17.232262 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-k5c7z" Dec 06 00:16:18 crc kubenswrapper[4734]: I1206 00:16:18.209512 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pm85" event={"ID":"6e38ec08-407f-4a7a-af27-dbed677adfb2","Type":"ContainerStarted","Data":"f31466a98de420a60dab5bf8b58fb886f6c5a065c347013204c39769c6ead09b"} Dec 06 00:16:18 crc kubenswrapper[4734]: I1206 00:16:18.289205 4734 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k5c7z" podUID="4ab978d4-d6b9-4e21-91a2-2794009d4483" containerName="registry-server" probeResult="failure" output=< Dec 06 00:16:18 crc kubenswrapper[4734]: timeout: failed to connect service ":50051" within 1s Dec 06 00:16:18 crc kubenswrapper[4734]: > Dec 06 00:16:19 crc kubenswrapper[4734]: I1206 00:16:19.224802 4734 generic.go:334] "Generic (PLEG): container finished" podID="6e38ec08-407f-4a7a-af27-dbed677adfb2" containerID="f31466a98de420a60dab5bf8b58fb886f6c5a065c347013204c39769c6ead09b" exitCode=0 Dec 06 00:16:19 crc kubenswrapper[4734]: I1206 00:16:19.224884 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pm85" event={"ID":"6e38ec08-407f-4a7a-af27-dbed677adfb2","Type":"ContainerDied","Data":"f31466a98de420a60dab5bf8b58fb886f6c5a065c347013204c39769c6ead09b"} Dec 06 00:16:20 crc kubenswrapper[4734]: I1206 00:16:20.251371 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pm85" event={"ID":"6e38ec08-407f-4a7a-af27-dbed677adfb2","Type":"ContainerStarted","Data":"fed2b9f20d47801e644d6506bc02c012a61d55de288a2d6c77ad8c3dc3aa1d23"} Dec 06 00:16:20 crc kubenswrapper[4734]: I1206 00:16:20.281545 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7pm85" podStartSLOduration=2.578900761 podStartE2EDuration="5.281500638s" podCreationTimestamp="2025-12-06 
00:16:15 +0000 UTC" firstStartedPulling="2025-12-06 00:16:17.196862514 +0000 UTC m=+3397.880266790" lastFinishedPulling="2025-12-06 00:16:19.899462391 +0000 UTC m=+3400.582866667" observedRunningTime="2025-12-06 00:16:20.272314694 +0000 UTC m=+3400.955718960" watchObservedRunningTime="2025-12-06 00:16:20.281500638 +0000 UTC m=+3400.964904914" Dec 06 00:16:25 crc kubenswrapper[4734]: I1206 00:16:25.940054 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7pm85" Dec 06 00:16:25 crc kubenswrapper[4734]: I1206 00:16:25.941719 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7pm85" Dec 06 00:16:25 crc kubenswrapper[4734]: I1206 00:16:25.994885 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7pm85" Dec 06 00:16:26 crc kubenswrapper[4734]: I1206 00:16:26.359124 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7pm85" Dec 06 00:16:26 crc kubenswrapper[4734]: I1206 00:16:26.417457 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7pm85"] Dec 06 00:16:27 crc kubenswrapper[4734]: I1206 00:16:27.285850 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k5c7z" Dec 06 00:16:27 crc kubenswrapper[4734]: I1206 00:16:27.352642 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k5c7z" Dec 06 00:16:28 crc kubenswrapper[4734]: I1206 00:16:28.331180 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7pm85" podUID="6e38ec08-407f-4a7a-af27-dbed677adfb2" containerName="registry-server" 
containerID="cri-o://fed2b9f20d47801e644d6506bc02c012a61d55de288a2d6c77ad8c3dc3aa1d23" gracePeriod=2 Dec 06 00:16:28 crc kubenswrapper[4734]: I1206 00:16:28.640828 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k5c7z"] Dec 06 00:16:28 crc kubenswrapper[4734]: I1206 00:16:28.641211 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k5c7z" podUID="4ab978d4-d6b9-4e21-91a2-2794009d4483" containerName="registry-server" containerID="cri-o://79b9dba12246119ab8ba1c35d88d21ca0e76a8d2a5dc068253d1c2f473f0d4e2" gracePeriod=2 Dec 06 00:16:28 crc kubenswrapper[4734]: I1206 00:16:28.896805 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7pm85" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.067883 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdmpk\" (UniqueName: \"kubernetes.io/projected/6e38ec08-407f-4a7a-af27-dbed677adfb2-kube-api-access-sdmpk\") pod \"6e38ec08-407f-4a7a-af27-dbed677adfb2\" (UID: \"6e38ec08-407f-4a7a-af27-dbed677adfb2\") " Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.068261 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e38ec08-407f-4a7a-af27-dbed677adfb2-catalog-content\") pod \"6e38ec08-407f-4a7a-af27-dbed677adfb2\" (UID: \"6e38ec08-407f-4a7a-af27-dbed677adfb2\") " Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.068422 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e38ec08-407f-4a7a-af27-dbed677adfb2-utilities\") pod \"6e38ec08-407f-4a7a-af27-dbed677adfb2\" (UID: \"6e38ec08-407f-4a7a-af27-dbed677adfb2\") " Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.069370 4734 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e38ec08-407f-4a7a-af27-dbed677adfb2-utilities" (OuterVolumeSpecName: "utilities") pod "6e38ec08-407f-4a7a-af27-dbed677adfb2" (UID: "6e38ec08-407f-4a7a-af27-dbed677adfb2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.076793 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e38ec08-407f-4a7a-af27-dbed677adfb2-kube-api-access-sdmpk" (OuterVolumeSpecName: "kube-api-access-sdmpk") pod "6e38ec08-407f-4a7a-af27-dbed677adfb2" (UID: "6e38ec08-407f-4a7a-af27-dbed677adfb2"). InnerVolumeSpecName "kube-api-access-sdmpk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.093019 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e38ec08-407f-4a7a-af27-dbed677adfb2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e38ec08-407f-4a7a-af27-dbed677adfb2" (UID: "6e38ec08-407f-4a7a-af27-dbed677adfb2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.171495 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e38ec08-407f-4a7a-af27-dbed677adfb2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.171567 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e38ec08-407f-4a7a-af27-dbed677adfb2-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.171578 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdmpk\" (UniqueName: \"kubernetes.io/projected/6e38ec08-407f-4a7a-af27-dbed677adfb2-kube-api-access-sdmpk\") on node \"crc\" DevicePath \"\"" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.188401 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k5c7z" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.293963 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6b7jl\" (UniqueName: \"kubernetes.io/projected/4ab978d4-d6b9-4e21-91a2-2794009d4483-kube-api-access-6b7jl\") pod \"4ab978d4-d6b9-4e21-91a2-2794009d4483\" (UID: \"4ab978d4-d6b9-4e21-91a2-2794009d4483\") " Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.295113 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ab978d4-d6b9-4e21-91a2-2794009d4483-catalog-content\") pod \"4ab978d4-d6b9-4e21-91a2-2794009d4483\" (UID: \"4ab978d4-d6b9-4e21-91a2-2794009d4483\") " Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.295226 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4ab978d4-d6b9-4e21-91a2-2794009d4483-utilities\") pod \"4ab978d4-d6b9-4e21-91a2-2794009d4483\" (UID: \"4ab978d4-d6b9-4e21-91a2-2794009d4483\") " Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.297216 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ab978d4-d6b9-4e21-91a2-2794009d4483-utilities" (OuterVolumeSpecName: "utilities") pod "4ab978d4-d6b9-4e21-91a2-2794009d4483" (UID: "4ab978d4-d6b9-4e21-91a2-2794009d4483"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.300664 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ab978d4-d6b9-4e21-91a2-2794009d4483-kube-api-access-6b7jl" (OuterVolumeSpecName: "kube-api-access-6b7jl") pod "4ab978d4-d6b9-4e21-91a2-2794009d4483" (UID: "4ab978d4-d6b9-4e21-91a2-2794009d4483"). InnerVolumeSpecName "kube-api-access-6b7jl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.345274 4734 generic.go:334] "Generic (PLEG): container finished" podID="4ab978d4-d6b9-4e21-91a2-2794009d4483" containerID="79b9dba12246119ab8ba1c35d88d21ca0e76a8d2a5dc068253d1c2f473f0d4e2" exitCode=0 Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.345377 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k5c7z" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.345374 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k5c7z" event={"ID":"4ab978d4-d6b9-4e21-91a2-2794009d4483","Type":"ContainerDied","Data":"79b9dba12246119ab8ba1c35d88d21ca0e76a8d2a5dc068253d1c2f473f0d4e2"} Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.345568 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k5c7z" event={"ID":"4ab978d4-d6b9-4e21-91a2-2794009d4483","Type":"ContainerDied","Data":"8e0c4b14a629e7770f43f2491b572251c60a0c086766903c15f314ea3942dd68"} Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.345599 4734 scope.go:117] "RemoveContainer" containerID="79b9dba12246119ab8ba1c35d88d21ca0e76a8d2a5dc068253d1c2f473f0d4e2" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.351157 4734 generic.go:334] "Generic (PLEG): container finished" podID="6e38ec08-407f-4a7a-af27-dbed677adfb2" containerID="fed2b9f20d47801e644d6506bc02c012a61d55de288a2d6c77ad8c3dc3aa1d23" exitCode=0 Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.351260 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pm85" event={"ID":"6e38ec08-407f-4a7a-af27-dbed677adfb2","Type":"ContainerDied","Data":"fed2b9f20d47801e644d6506bc02c012a61d55de288a2d6c77ad8c3dc3aa1d23"} Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.351721 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pm85" event={"ID":"6e38ec08-407f-4a7a-af27-dbed677adfb2","Type":"ContainerDied","Data":"c901c38f8428ad0d369dea2663671c90a280094a4e0ec59a30424f06868610b9"} Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.351282 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7pm85" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.377103 4734 scope.go:117] "RemoveContainer" containerID="ddf779e9f299cf21f00a293cf173be2a275ba5f8cfa6e72d97611823d6ba6631" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.400182 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ab978d4-d6b9-4e21-91a2-2794009d4483-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.400228 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6b7jl\" (UniqueName: \"kubernetes.io/projected/4ab978d4-d6b9-4e21-91a2-2794009d4483-kube-api-access-6b7jl\") on node \"crc\" DevicePath \"\"" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.410830 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7pm85"] Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.417356 4734 scope.go:117] "RemoveContainer" containerID="90f1274762d124a81ed19b34b15814be83309b675084411d063fc68b1d51bee7" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.423384 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7pm85"] Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.451401 4734 scope.go:117] "RemoveContainer" containerID="79b9dba12246119ab8ba1c35d88d21ca0e76a8d2a5dc068253d1c2f473f0d4e2" Dec 06 00:16:29 crc kubenswrapper[4734]: E1206 00:16:29.452107 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79b9dba12246119ab8ba1c35d88d21ca0e76a8d2a5dc068253d1c2f473f0d4e2\": container with ID starting with 79b9dba12246119ab8ba1c35d88d21ca0e76a8d2a5dc068253d1c2f473f0d4e2 not found: ID does not exist" containerID="79b9dba12246119ab8ba1c35d88d21ca0e76a8d2a5dc068253d1c2f473f0d4e2" Dec 06 00:16:29 crc 
kubenswrapper[4734]: I1206 00:16:29.452172 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79b9dba12246119ab8ba1c35d88d21ca0e76a8d2a5dc068253d1c2f473f0d4e2"} err="failed to get container status \"79b9dba12246119ab8ba1c35d88d21ca0e76a8d2a5dc068253d1c2f473f0d4e2\": rpc error: code = NotFound desc = could not find container \"79b9dba12246119ab8ba1c35d88d21ca0e76a8d2a5dc068253d1c2f473f0d4e2\": container with ID starting with 79b9dba12246119ab8ba1c35d88d21ca0e76a8d2a5dc068253d1c2f473f0d4e2 not found: ID does not exist" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.452210 4734 scope.go:117] "RemoveContainer" containerID="ddf779e9f299cf21f00a293cf173be2a275ba5f8cfa6e72d97611823d6ba6631" Dec 06 00:16:29 crc kubenswrapper[4734]: E1206 00:16:29.452926 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddf779e9f299cf21f00a293cf173be2a275ba5f8cfa6e72d97611823d6ba6631\": container with ID starting with ddf779e9f299cf21f00a293cf173be2a275ba5f8cfa6e72d97611823d6ba6631 not found: ID does not exist" containerID="ddf779e9f299cf21f00a293cf173be2a275ba5f8cfa6e72d97611823d6ba6631" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.452976 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddf779e9f299cf21f00a293cf173be2a275ba5f8cfa6e72d97611823d6ba6631"} err="failed to get container status \"ddf779e9f299cf21f00a293cf173be2a275ba5f8cfa6e72d97611823d6ba6631\": rpc error: code = NotFound desc = could not find container \"ddf779e9f299cf21f00a293cf173be2a275ba5f8cfa6e72d97611823d6ba6631\": container with ID starting with ddf779e9f299cf21f00a293cf173be2a275ba5f8cfa6e72d97611823d6ba6631 not found: ID does not exist" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.452994 4734 scope.go:117] "RemoveContainer" containerID="90f1274762d124a81ed19b34b15814be83309b675084411d063fc68b1d51bee7" Dec 06 
00:16:29 crc kubenswrapper[4734]: E1206 00:16:29.453344 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90f1274762d124a81ed19b34b15814be83309b675084411d063fc68b1d51bee7\": container with ID starting with 90f1274762d124a81ed19b34b15814be83309b675084411d063fc68b1d51bee7 not found: ID does not exist" containerID="90f1274762d124a81ed19b34b15814be83309b675084411d063fc68b1d51bee7" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.453405 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90f1274762d124a81ed19b34b15814be83309b675084411d063fc68b1d51bee7"} err="failed to get container status \"90f1274762d124a81ed19b34b15814be83309b675084411d063fc68b1d51bee7\": rpc error: code = NotFound desc = could not find container \"90f1274762d124a81ed19b34b15814be83309b675084411d063fc68b1d51bee7\": container with ID starting with 90f1274762d124a81ed19b34b15814be83309b675084411d063fc68b1d51bee7 not found: ID does not exist" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.453423 4734 scope.go:117] "RemoveContainer" containerID="fed2b9f20d47801e644d6506bc02c012a61d55de288a2d6c77ad8c3dc3aa1d23" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.454294 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ab978d4-d6b9-4e21-91a2-2794009d4483-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ab978d4-d6b9-4e21-91a2-2794009d4483" (UID: "4ab978d4-d6b9-4e21-91a2-2794009d4483"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.503042 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ab978d4-d6b9-4e21-91a2-2794009d4483-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.525607 4734 scope.go:117] "RemoveContainer" containerID="f31466a98de420a60dab5bf8b58fb886f6c5a065c347013204c39769c6ead09b" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.548461 4734 scope.go:117] "RemoveContainer" containerID="984992e21d03f9fc583c12c877f6a109d1be32797082dd38459fd54b25acfae4" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.599204 4734 scope.go:117] "RemoveContainer" containerID="fed2b9f20d47801e644d6506bc02c012a61d55de288a2d6c77ad8c3dc3aa1d23" Dec 06 00:16:29 crc kubenswrapper[4734]: E1206 00:16:29.600226 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fed2b9f20d47801e644d6506bc02c012a61d55de288a2d6c77ad8c3dc3aa1d23\": container with ID starting with fed2b9f20d47801e644d6506bc02c012a61d55de288a2d6c77ad8c3dc3aa1d23 not found: ID does not exist" containerID="fed2b9f20d47801e644d6506bc02c012a61d55de288a2d6c77ad8c3dc3aa1d23" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.600270 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fed2b9f20d47801e644d6506bc02c012a61d55de288a2d6c77ad8c3dc3aa1d23"} err="failed to get container status \"fed2b9f20d47801e644d6506bc02c012a61d55de288a2d6c77ad8c3dc3aa1d23\": rpc error: code = NotFound desc = could not find container \"fed2b9f20d47801e644d6506bc02c012a61d55de288a2d6c77ad8c3dc3aa1d23\": container with ID starting with fed2b9f20d47801e644d6506bc02c012a61d55de288a2d6c77ad8c3dc3aa1d23 not found: ID does not exist" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.600304 4734 
scope.go:117] "RemoveContainer" containerID="f31466a98de420a60dab5bf8b58fb886f6c5a065c347013204c39769c6ead09b" Dec 06 00:16:29 crc kubenswrapper[4734]: E1206 00:16:29.600999 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f31466a98de420a60dab5bf8b58fb886f6c5a065c347013204c39769c6ead09b\": container with ID starting with f31466a98de420a60dab5bf8b58fb886f6c5a065c347013204c39769c6ead09b not found: ID does not exist" containerID="f31466a98de420a60dab5bf8b58fb886f6c5a065c347013204c39769c6ead09b" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.601059 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f31466a98de420a60dab5bf8b58fb886f6c5a065c347013204c39769c6ead09b"} err="failed to get container status \"f31466a98de420a60dab5bf8b58fb886f6c5a065c347013204c39769c6ead09b\": rpc error: code = NotFound desc = could not find container \"f31466a98de420a60dab5bf8b58fb886f6c5a065c347013204c39769c6ead09b\": container with ID starting with f31466a98de420a60dab5bf8b58fb886f6c5a065c347013204c39769c6ead09b not found: ID does not exist" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.601077 4734 scope.go:117] "RemoveContainer" containerID="984992e21d03f9fc583c12c877f6a109d1be32797082dd38459fd54b25acfae4" Dec 06 00:16:29 crc kubenswrapper[4734]: E1206 00:16:29.601472 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"984992e21d03f9fc583c12c877f6a109d1be32797082dd38459fd54b25acfae4\": container with ID starting with 984992e21d03f9fc583c12c877f6a109d1be32797082dd38459fd54b25acfae4 not found: ID does not exist" containerID="984992e21d03f9fc583c12c877f6a109d1be32797082dd38459fd54b25acfae4" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.601503 4734 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"984992e21d03f9fc583c12c877f6a109d1be32797082dd38459fd54b25acfae4"} err="failed to get container status \"984992e21d03f9fc583c12c877f6a109d1be32797082dd38459fd54b25acfae4\": rpc error: code = NotFound desc = could not find container \"984992e21d03f9fc583c12c877f6a109d1be32797082dd38459fd54b25acfae4\": container with ID starting with 984992e21d03f9fc583c12c877f6a109d1be32797082dd38459fd54b25acfae4 not found: ID does not exist" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.632369 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e38ec08-407f-4a7a-af27-dbed677adfb2" path="/var/lib/kubelet/pods/6e38ec08-407f-4a7a-af27-dbed677adfb2/volumes" Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.674456 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k5c7z"] Dec 06 00:16:29 crc kubenswrapper[4734]: I1206 00:16:29.684147 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k5c7z"] Dec 06 00:16:31 crc kubenswrapper[4734]: I1206 00:16:31.627740 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ab978d4-d6b9-4e21-91a2-2794009d4483" path="/var/lib/kubelet/pods/4ab978d4-d6b9-4e21-91a2-2794009d4483/volumes" Dec 06 00:16:50 crc kubenswrapper[4734]: I1206 00:16:50.444804 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 00:16:50 crc kubenswrapper[4734]: I1206 00:16:50.445595 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 00:17:20 crc kubenswrapper[4734]: I1206 00:17:20.444128 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 00:17:20 crc kubenswrapper[4734]: I1206 00:17:20.444763 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 00:17:50 crc kubenswrapper[4734]: I1206 00:17:50.445255 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 00:17:50 crc kubenswrapper[4734]: I1206 00:17:50.446179 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 00:17:50 crc kubenswrapper[4734]: I1206 00:17:50.446246 4734 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" Dec 06 00:17:50 crc kubenswrapper[4734]: I1206 00:17:50.447299 4734 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"445f686f3fcd87f38025591fb4ba15fb2afa8d39ade3e8d4bf67b8c30d64687b"} pod="openshift-machine-config-operator/machine-config-daemon-vn94d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 00:17:50 crc kubenswrapper[4734]: I1206 00:17:50.447367 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" containerID="cri-o://445f686f3fcd87f38025591fb4ba15fb2afa8d39ade3e8d4bf67b8c30d64687b" gracePeriod=600 Dec 06 00:17:51 crc kubenswrapper[4734]: I1206 00:17:51.457286 4734 generic.go:334] "Generic (PLEG): container finished" podID="65758270-a7a7-46b5-af95-0588daf9fa86" containerID="445f686f3fcd87f38025591fb4ba15fb2afa8d39ade3e8d4bf67b8c30d64687b" exitCode=0 Dec 06 00:17:51 crc kubenswrapper[4734]: I1206 00:17:51.457363 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerDied","Data":"445f686f3fcd87f38025591fb4ba15fb2afa8d39ade3e8d4bf67b8c30d64687b"} Dec 06 00:17:51 crc kubenswrapper[4734]: I1206 00:17:51.458113 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerStarted","Data":"30e2002c7070045c2b20466e69d767cf741f7938dac6f70dfac6c07537e1dd50"} Dec 06 00:17:51 crc kubenswrapper[4734]: I1206 00:17:51.458142 4734 scope.go:117] "RemoveContainer" containerID="22dd21d7db0e95a5c72f8c095ca9b5794ebebca64374ec13a730e95ad481a85b" Dec 06 00:19:50 crc kubenswrapper[4734]: I1206 00:19:50.444801 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 00:19:50 crc kubenswrapper[4734]: I1206 00:19:50.445741 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 00:20:20 crc kubenswrapper[4734]: I1206 00:20:20.444768 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 00:20:20 crc kubenswrapper[4734]: I1206 00:20:20.445653 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 00:20:41 crc kubenswrapper[4734]: I1206 00:20:41.658965 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bp74j"] Dec 06 00:20:41 crc kubenswrapper[4734]: E1206 00:20:41.660334 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e38ec08-407f-4a7a-af27-dbed677adfb2" containerName="extract-content" Dec 06 00:20:41 crc kubenswrapper[4734]: I1206 00:20:41.660350 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e38ec08-407f-4a7a-af27-dbed677adfb2" containerName="extract-content" Dec 06 00:20:41 crc kubenswrapper[4734]: E1206 00:20:41.660365 4734 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6e38ec08-407f-4a7a-af27-dbed677adfb2" containerName="registry-server" Dec 06 00:20:41 crc kubenswrapper[4734]: I1206 00:20:41.660372 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e38ec08-407f-4a7a-af27-dbed677adfb2" containerName="registry-server" Dec 06 00:20:41 crc kubenswrapper[4734]: E1206 00:20:41.660383 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ab978d4-d6b9-4e21-91a2-2794009d4483" containerName="extract-utilities" Dec 06 00:20:41 crc kubenswrapper[4734]: I1206 00:20:41.660389 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ab978d4-d6b9-4e21-91a2-2794009d4483" containerName="extract-utilities" Dec 06 00:20:41 crc kubenswrapper[4734]: E1206 00:20:41.660400 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e38ec08-407f-4a7a-af27-dbed677adfb2" containerName="extract-utilities" Dec 06 00:20:41 crc kubenswrapper[4734]: I1206 00:20:41.660407 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e38ec08-407f-4a7a-af27-dbed677adfb2" containerName="extract-utilities" Dec 06 00:20:41 crc kubenswrapper[4734]: E1206 00:20:41.660425 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ab978d4-d6b9-4e21-91a2-2794009d4483" containerName="extract-content" Dec 06 00:20:41 crc kubenswrapper[4734]: I1206 00:20:41.660431 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ab978d4-d6b9-4e21-91a2-2794009d4483" containerName="extract-content" Dec 06 00:20:41 crc kubenswrapper[4734]: E1206 00:20:41.660456 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ab978d4-d6b9-4e21-91a2-2794009d4483" containerName="registry-server" Dec 06 00:20:41 crc kubenswrapper[4734]: I1206 00:20:41.660462 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ab978d4-d6b9-4e21-91a2-2794009d4483" containerName="registry-server" Dec 06 00:20:41 crc kubenswrapper[4734]: I1206 00:20:41.660752 4734 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4ab978d4-d6b9-4e21-91a2-2794009d4483" containerName="registry-server" Dec 06 00:20:41 crc kubenswrapper[4734]: I1206 00:20:41.660770 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e38ec08-407f-4a7a-af27-dbed677adfb2" containerName="registry-server" Dec 06 00:20:41 crc kubenswrapper[4734]: I1206 00:20:41.662858 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bp74j" Dec 06 00:20:41 crc kubenswrapper[4734]: I1206 00:20:41.675610 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bp74j"] Dec 06 00:20:41 crc kubenswrapper[4734]: I1206 00:20:41.752821 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4841733-feb5-4510-8b7a-284c296b03e7-catalog-content\") pod \"community-operators-bp74j\" (UID: \"e4841733-feb5-4510-8b7a-284c296b03e7\") " pod="openshift-marketplace/community-operators-bp74j" Dec 06 00:20:41 crc kubenswrapper[4734]: I1206 00:20:41.752926 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4841733-feb5-4510-8b7a-284c296b03e7-utilities\") pod \"community-operators-bp74j\" (UID: \"e4841733-feb5-4510-8b7a-284c296b03e7\") " pod="openshift-marketplace/community-operators-bp74j" Dec 06 00:20:41 crc kubenswrapper[4734]: I1206 00:20:41.753190 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8httm\" (UniqueName: \"kubernetes.io/projected/e4841733-feb5-4510-8b7a-284c296b03e7-kube-api-access-8httm\") pod \"community-operators-bp74j\" (UID: \"e4841733-feb5-4510-8b7a-284c296b03e7\") " pod="openshift-marketplace/community-operators-bp74j" Dec 06 00:20:41 crc kubenswrapper[4734]: I1206 00:20:41.855925 4734 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4841733-feb5-4510-8b7a-284c296b03e7-utilities\") pod \"community-operators-bp74j\" (UID: \"e4841733-feb5-4510-8b7a-284c296b03e7\") " pod="openshift-marketplace/community-operators-bp74j" Dec 06 00:20:41 crc kubenswrapper[4734]: I1206 00:20:41.856054 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8httm\" (UniqueName: \"kubernetes.io/projected/e4841733-feb5-4510-8b7a-284c296b03e7-kube-api-access-8httm\") pod \"community-operators-bp74j\" (UID: \"e4841733-feb5-4510-8b7a-284c296b03e7\") " pod="openshift-marketplace/community-operators-bp74j" Dec 06 00:20:41 crc kubenswrapper[4734]: I1206 00:20:41.856192 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4841733-feb5-4510-8b7a-284c296b03e7-catalog-content\") pod \"community-operators-bp74j\" (UID: \"e4841733-feb5-4510-8b7a-284c296b03e7\") " pod="openshift-marketplace/community-operators-bp74j" Dec 06 00:20:41 crc kubenswrapper[4734]: I1206 00:20:41.856700 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4841733-feb5-4510-8b7a-284c296b03e7-utilities\") pod \"community-operators-bp74j\" (UID: \"e4841733-feb5-4510-8b7a-284c296b03e7\") " pod="openshift-marketplace/community-operators-bp74j" Dec 06 00:20:41 crc kubenswrapper[4734]: I1206 00:20:41.856794 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4841733-feb5-4510-8b7a-284c296b03e7-catalog-content\") pod \"community-operators-bp74j\" (UID: \"e4841733-feb5-4510-8b7a-284c296b03e7\") " pod="openshift-marketplace/community-operators-bp74j" Dec 06 00:20:41 crc kubenswrapper[4734]: I1206 00:20:41.879978 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8httm\" (UniqueName: \"kubernetes.io/projected/e4841733-feb5-4510-8b7a-284c296b03e7-kube-api-access-8httm\") pod \"community-operators-bp74j\" (UID: \"e4841733-feb5-4510-8b7a-284c296b03e7\") " pod="openshift-marketplace/community-operators-bp74j" Dec 06 00:20:41 crc kubenswrapper[4734]: I1206 00:20:41.995837 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bp74j" Dec 06 00:20:42 crc kubenswrapper[4734]: I1206 00:20:42.668626 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bp74j"] Dec 06 00:20:43 crc kubenswrapper[4734]: I1206 00:20:43.216489 4734 generic.go:334] "Generic (PLEG): container finished" podID="e4841733-feb5-4510-8b7a-284c296b03e7" containerID="e76f2b992c022323503d76c4eb85def0433969546a3da7a3011ce1a4cdb48b7e" exitCode=0 Dec 06 00:20:43 crc kubenswrapper[4734]: I1206 00:20:43.216615 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bp74j" event={"ID":"e4841733-feb5-4510-8b7a-284c296b03e7","Type":"ContainerDied","Data":"e76f2b992c022323503d76c4eb85def0433969546a3da7a3011ce1a4cdb48b7e"} Dec 06 00:20:43 crc kubenswrapper[4734]: I1206 00:20:43.216976 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bp74j" event={"ID":"e4841733-feb5-4510-8b7a-284c296b03e7","Type":"ContainerStarted","Data":"9a7d1efdda30d7662bac19cf3997e2fff8f7a27ec3287984987c8d68e774180d"} Dec 06 00:20:44 crc kubenswrapper[4734]: I1206 00:20:44.230719 4734 generic.go:334] "Generic (PLEG): container finished" podID="5d24dfd1-9ec6-4419-84c6-577deb60b95f" containerID="b44b03c27e1d40353aa910c6a11ab1d2aa1041ef998f377eb6d38c6d268f18cb" exitCode=0 Dec 06 00:20:44 crc kubenswrapper[4734]: I1206 00:20:44.230816 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" 
event={"ID":"5d24dfd1-9ec6-4419-84c6-577deb60b95f","Type":"ContainerDied","Data":"b44b03c27e1d40353aa910c6a11ab1d2aa1041ef998f377eb6d38c6d268f18cb"} Dec 06 00:20:45 crc kubenswrapper[4734]: I1206 00:20:45.627295 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 06 00:20:45 crc kubenswrapper[4734]: I1206 00:20:45.751586 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bw9z\" (UniqueName: \"kubernetes.io/projected/5d24dfd1-9ec6-4419-84c6-577deb60b95f-kube-api-access-2bw9z\") pod \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " Dec 06 00:20:45 crc kubenswrapper[4734]: I1206 00:20:45.751646 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " Dec 06 00:20:45 crc kubenswrapper[4734]: I1206 00:20:45.751680 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5d24dfd1-9ec6-4419-84c6-577deb60b95f-ca-certs\") pod \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " Dec 06 00:20:45 crc kubenswrapper[4734]: I1206 00:20:45.751818 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5d24dfd1-9ec6-4419-84c6-577deb60b95f-test-operator-ephemeral-workdir\") pod \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " Dec 06 00:20:45 crc kubenswrapper[4734]: I1206 00:20:45.751971 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/5d24dfd1-9ec6-4419-84c6-577deb60b95f-test-operator-ephemeral-temporary\") pod \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " Dec 06 00:20:45 crc kubenswrapper[4734]: I1206 00:20:45.752043 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5d24dfd1-9ec6-4419-84c6-577deb60b95f-openstack-config\") pod \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " Dec 06 00:20:45 crc kubenswrapper[4734]: I1206 00:20:45.752100 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5d24dfd1-9ec6-4419-84c6-577deb60b95f-openstack-config-secret\") pod \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " Dec 06 00:20:45 crc kubenswrapper[4734]: I1206 00:20:45.752172 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d24dfd1-9ec6-4419-84c6-577deb60b95f-config-data\") pod \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " Dec 06 00:20:45 crc kubenswrapper[4734]: I1206 00:20:45.752197 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d24dfd1-9ec6-4419-84c6-577deb60b95f-ssh-key\") pod \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\" (UID: \"5d24dfd1-9ec6-4419-84c6-577deb60b95f\") " Dec 06 00:20:45 crc kubenswrapper[4734]: I1206 00:20:45.752657 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d24dfd1-9ec6-4419-84c6-577deb60b95f-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "5d24dfd1-9ec6-4419-84c6-577deb60b95f" (UID: "5d24dfd1-9ec6-4419-84c6-577deb60b95f"). 
InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 00:20:45 crc kubenswrapper[4734]: I1206 00:20:45.753891 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d24dfd1-9ec6-4419-84c6-577deb60b95f-config-data" (OuterVolumeSpecName: "config-data") pod "5d24dfd1-9ec6-4419-84c6-577deb60b95f" (UID: "5d24dfd1-9ec6-4419-84c6-577deb60b95f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 00:20:45 crc kubenswrapper[4734]: I1206 00:20:45.755689 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d24dfd1-9ec6-4419-84c6-577deb60b95f-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "5d24dfd1-9ec6-4419-84c6-577deb60b95f" (UID: "5d24dfd1-9ec6-4419-84c6-577deb60b95f"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 00:20:45 crc kubenswrapper[4734]: I1206 00:20:45.759705 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d24dfd1-9ec6-4419-84c6-577deb60b95f-kube-api-access-2bw9z" (OuterVolumeSpecName: "kube-api-access-2bw9z") pod "5d24dfd1-9ec6-4419-84c6-577deb60b95f" (UID: "5d24dfd1-9ec6-4419-84c6-577deb60b95f"). InnerVolumeSpecName "kube-api-access-2bw9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:20:45 crc kubenswrapper[4734]: I1206 00:20:45.759951 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "test-operator-logs") pod "5d24dfd1-9ec6-4419-84c6-577deb60b95f" (UID: "5d24dfd1-9ec6-4419-84c6-577deb60b95f"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 06 00:20:45 crc kubenswrapper[4734]: I1206 00:20:45.789669 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d24dfd1-9ec6-4419-84c6-577deb60b95f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5d24dfd1-9ec6-4419-84c6-577deb60b95f" (UID: "5d24dfd1-9ec6-4419-84c6-577deb60b95f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:20:45 crc kubenswrapper[4734]: I1206 00:20:45.789749 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d24dfd1-9ec6-4419-84c6-577deb60b95f-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "5d24dfd1-9ec6-4419-84c6-577deb60b95f" (UID: "5d24dfd1-9ec6-4419-84c6-577deb60b95f"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:20:45 crc kubenswrapper[4734]: I1206 00:20:45.793595 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d24dfd1-9ec6-4419-84c6-577deb60b95f-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "5d24dfd1-9ec6-4419-84c6-577deb60b95f" (UID: "5d24dfd1-9ec6-4419-84c6-577deb60b95f"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:20:45 crc kubenswrapper[4734]: I1206 00:20:45.816106 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d24dfd1-9ec6-4419-84c6-577deb60b95f-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "5d24dfd1-9ec6-4419-84c6-577deb60b95f" (UID: "5d24dfd1-9ec6-4419-84c6-577deb60b95f"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 00:20:45 crc kubenswrapper[4734]: I1206 00:20:45.854987 4734 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/5d24dfd1-9ec6-4419-84c6-577deb60b95f-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 06 00:20:45 crc kubenswrapper[4734]: I1206 00:20:45.855313 4734 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/5d24dfd1-9ec6-4419-84c6-577deb60b95f-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 06 00:20:45 crc kubenswrapper[4734]: I1206 00:20:45.855398 4734 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5d24dfd1-9ec6-4419-84c6-577deb60b95f-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 06 00:20:45 crc kubenswrapper[4734]: I1206 00:20:45.855480 4734 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5d24dfd1-9ec6-4419-84c6-577deb60b95f-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 06 00:20:45 crc kubenswrapper[4734]: I1206 00:20:45.855578 4734 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d24dfd1-9ec6-4419-84c6-577deb60b95f-config-data\") on node \"crc\" DevicePath \"\"" Dec 06 00:20:45 crc kubenswrapper[4734]: I1206 00:20:45.855662 4734 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d24dfd1-9ec6-4419-84c6-577deb60b95f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 06 00:20:45 crc kubenswrapper[4734]: I1206 00:20:45.855733 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bw9z\" (UniqueName: \"kubernetes.io/projected/5d24dfd1-9ec6-4419-84c6-577deb60b95f-kube-api-access-2bw9z\") on node \"crc\" 
DevicePath \"\"" Dec 06 00:20:45 crc kubenswrapper[4734]: I1206 00:20:45.855824 4734 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 06 00:20:45 crc kubenswrapper[4734]: I1206 00:20:45.855928 4734 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/5d24dfd1-9ec6-4419-84c6-577deb60b95f-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 06 00:20:45 crc kubenswrapper[4734]: I1206 00:20:45.887400 4734 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 06 00:20:45 crc kubenswrapper[4734]: I1206 00:20:45.958319 4734 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 06 00:20:46 crc kubenswrapper[4734]: I1206 00:20:46.256622 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"5d24dfd1-9ec6-4419-84c6-577deb60b95f","Type":"ContainerDied","Data":"f61033f57cc009ff380a8bcc069bd70943668c113ca81b22873ac528d506e8a3"} Dec 06 00:20:46 crc kubenswrapper[4734]: I1206 00:20:46.256687 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f61033f57cc009ff380a8bcc069bd70943668c113ca81b22873ac528d506e8a3" Dec 06 00:20:46 crc kubenswrapper[4734]: I1206 00:20:46.256702 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 06 00:20:48 crc kubenswrapper[4734]: I1206 00:20:48.277089 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bp74j" event={"ID":"e4841733-feb5-4510-8b7a-284c296b03e7","Type":"ContainerStarted","Data":"54f236e7891cf1dc3e42ba818c07537d2e6fc8fff835d8f00058e890e74c6171"} Dec 06 00:20:49 crc kubenswrapper[4734]: I1206 00:20:49.292766 4734 generic.go:334] "Generic (PLEG): container finished" podID="e4841733-feb5-4510-8b7a-284c296b03e7" containerID="54f236e7891cf1dc3e42ba818c07537d2e6fc8fff835d8f00058e890e74c6171" exitCode=0 Dec 06 00:20:49 crc kubenswrapper[4734]: I1206 00:20:49.292869 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bp74j" event={"ID":"e4841733-feb5-4510-8b7a-284c296b03e7","Type":"ContainerDied","Data":"54f236e7891cf1dc3e42ba818c07537d2e6fc8fff835d8f00058e890e74c6171"} Dec 06 00:20:50 crc kubenswrapper[4734]: I1206 00:20:50.307724 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bp74j" event={"ID":"e4841733-feb5-4510-8b7a-284c296b03e7","Type":"ContainerStarted","Data":"fdbc09e05615321e0179b9fa24b4d5c2b4676d317e65a0174207e88008569948"} Dec 06 00:20:50 crc kubenswrapper[4734]: I1206 00:20:50.336709 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bp74j" podStartSLOduration=2.772656613 podStartE2EDuration="9.336683362s" podCreationTimestamp="2025-12-06 00:20:41 +0000 UTC" firstStartedPulling="2025-12-06 00:20:43.219457148 +0000 UTC m=+3663.902861424" lastFinishedPulling="2025-12-06 00:20:49.783483907 +0000 UTC m=+3670.466888173" observedRunningTime="2025-12-06 00:20:50.329546488 +0000 UTC m=+3671.012950774" watchObservedRunningTime="2025-12-06 00:20:50.336683362 +0000 UTC m=+3671.020087638" Dec 06 00:20:50 crc kubenswrapper[4734]: I1206 00:20:50.445051 4734 
patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 00:20:50 crc kubenswrapper[4734]: I1206 00:20:50.445127 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 00:20:50 crc kubenswrapper[4734]: I1206 00:20:50.445188 4734 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" Dec 06 00:20:50 crc kubenswrapper[4734]: I1206 00:20:50.446378 4734 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"30e2002c7070045c2b20466e69d767cf741f7938dac6f70dfac6c07537e1dd50"} pod="openshift-machine-config-operator/machine-config-daemon-vn94d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 00:20:50 crc kubenswrapper[4734]: I1206 00:20:50.446464 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" containerID="cri-o://30e2002c7070045c2b20466e69d767cf741f7938dac6f70dfac6c07537e1dd50" gracePeriod=600 Dec 06 00:20:50 crc kubenswrapper[4734]: E1206 00:20:50.583633 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:20:51 crc kubenswrapper[4734]: I1206 00:20:51.321909 4734 generic.go:334] "Generic (PLEG): container finished" podID="65758270-a7a7-46b5-af95-0588daf9fa86" containerID="30e2002c7070045c2b20466e69d767cf741f7938dac6f70dfac6c07537e1dd50" exitCode=0 Dec 06 00:20:51 crc kubenswrapper[4734]: I1206 00:20:51.321987 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerDied","Data":"30e2002c7070045c2b20466e69d767cf741f7938dac6f70dfac6c07537e1dd50"} Dec 06 00:20:51 crc kubenswrapper[4734]: I1206 00:20:51.322450 4734 scope.go:117] "RemoveContainer" containerID="445f686f3fcd87f38025591fb4ba15fb2afa8d39ade3e8d4bf67b8c30d64687b" Dec 06 00:20:51 crc kubenswrapper[4734]: I1206 00:20:51.324151 4734 scope.go:117] "RemoveContainer" containerID="30e2002c7070045c2b20466e69d767cf741f7938dac6f70dfac6c07537e1dd50" Dec 06 00:20:51 crc kubenswrapper[4734]: E1206 00:20:51.324472 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:20:51 crc kubenswrapper[4734]: I1206 00:20:51.916075 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 06 00:20:51 crc kubenswrapper[4734]: E1206 00:20:51.927865 4734 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5d24dfd1-9ec6-4419-84c6-577deb60b95f" containerName="tempest-tests-tempest-tests-runner" Dec 06 00:20:51 crc kubenswrapper[4734]: I1206 00:20:51.927895 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d24dfd1-9ec6-4419-84c6-577deb60b95f" containerName="tempest-tests-tempest-tests-runner" Dec 06 00:20:51 crc kubenswrapper[4734]: I1206 00:20:51.928199 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d24dfd1-9ec6-4419-84c6-577deb60b95f" containerName="tempest-tests-tempest-tests-runner" Dec 06 00:20:51 crc kubenswrapper[4734]: I1206 00:20:51.929114 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 00:20:51 crc kubenswrapper[4734]: I1206 00:20:51.934219 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-ccp8x" Dec 06 00:20:51 crc kubenswrapper[4734]: I1206 00:20:51.938328 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 06 00:20:52 crc kubenswrapper[4734]: I1206 00:20:52.003348 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bp74j" Dec 06 00:20:52 crc kubenswrapper[4734]: I1206 00:20:52.003442 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bp74j" Dec 06 00:20:52 crc kubenswrapper[4734]: I1206 00:20:52.063776 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bp74j" Dec 06 00:20:52 crc kubenswrapper[4734]: I1206 00:20:52.122283 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: 
\"e615b54d-cff3-4de2-8569-9c492e2234e0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 00:20:52 crc kubenswrapper[4734]: I1206 00:20:52.122479 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjlhh\" (UniqueName: \"kubernetes.io/projected/e615b54d-cff3-4de2-8569-9c492e2234e0-kube-api-access-jjlhh\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e615b54d-cff3-4de2-8569-9c492e2234e0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 00:20:52 crc kubenswrapper[4734]: I1206 00:20:52.225138 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjlhh\" (UniqueName: \"kubernetes.io/projected/e615b54d-cff3-4de2-8569-9c492e2234e0-kube-api-access-jjlhh\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e615b54d-cff3-4de2-8569-9c492e2234e0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 00:20:52 crc kubenswrapper[4734]: I1206 00:20:52.225446 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e615b54d-cff3-4de2-8569-9c492e2234e0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 00:20:52 crc kubenswrapper[4734]: I1206 00:20:52.226131 4734 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e615b54d-cff3-4de2-8569-9c492e2234e0\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 00:20:52 crc kubenswrapper[4734]: I1206 00:20:52.247764 4734 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jjlhh\" (UniqueName: \"kubernetes.io/projected/e615b54d-cff3-4de2-8569-9c492e2234e0-kube-api-access-jjlhh\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e615b54d-cff3-4de2-8569-9c492e2234e0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 00:20:52 crc kubenswrapper[4734]: I1206 00:20:52.255397 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e615b54d-cff3-4de2-8569-9c492e2234e0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 00:20:52 crc kubenswrapper[4734]: I1206 00:20:52.554927 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 06 00:20:53 crc kubenswrapper[4734]: I1206 00:20:53.030492 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 06 00:20:53 crc kubenswrapper[4734]: W1206 00:20:53.031686 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode615b54d_cff3_4de2_8569_9c492e2234e0.slice/crio-e92356d0378c14a71ef76c495edc47d4a05a2f05355627497992df706eb76f7a WatchSource:0}: Error finding container e92356d0378c14a71ef76c495edc47d4a05a2f05355627497992df706eb76f7a: Status 404 returned error can't find the container with id e92356d0378c14a71ef76c495edc47d4a05a2f05355627497992df706eb76f7a Dec 06 00:20:53 crc kubenswrapper[4734]: I1206 00:20:53.354858 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" 
event={"ID":"e615b54d-cff3-4de2-8569-9c492e2234e0","Type":"ContainerStarted","Data":"e92356d0378c14a71ef76c495edc47d4a05a2f05355627497992df706eb76f7a"} Dec 06 00:20:54 crc kubenswrapper[4734]: I1206 00:20:54.366471 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"e615b54d-cff3-4de2-8569-9c492e2234e0","Type":"ContainerStarted","Data":"0098b436f918a495b39c4d9ff559ac0dcd67753af00a8560ee97ea3367e99651"} Dec 06 00:20:54 crc kubenswrapper[4734]: I1206 00:20:54.391665 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.404838824 podStartE2EDuration="3.391635089s" podCreationTimestamp="2025-12-06 00:20:51 +0000 UTC" firstStartedPulling="2025-12-06 00:20:53.034555264 +0000 UTC m=+3673.717959540" lastFinishedPulling="2025-12-06 00:20:54.021351529 +0000 UTC m=+3674.704755805" observedRunningTime="2025-12-06 00:20:54.38468507 +0000 UTC m=+3675.068089356" watchObservedRunningTime="2025-12-06 00:20:54.391635089 +0000 UTC m=+3675.075039365" Dec 06 00:21:02 crc kubenswrapper[4734]: I1206 00:21:02.047777 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bp74j" Dec 06 00:21:02 crc kubenswrapper[4734]: I1206 00:21:02.109424 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bp74j"] Dec 06 00:21:02 crc kubenswrapper[4734]: I1206 00:21:02.445108 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bp74j" podUID="e4841733-feb5-4510-8b7a-284c296b03e7" containerName="registry-server" containerID="cri-o://fdbc09e05615321e0179b9fa24b4d5c2b4676d317e65a0174207e88008569948" gracePeriod=2 Dec 06 00:21:02 crc kubenswrapper[4734]: I1206 00:21:02.918646 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bp74j" Dec 06 00:21:02 crc kubenswrapper[4734]: I1206 00:21:02.977464 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4841733-feb5-4510-8b7a-284c296b03e7-utilities\") pod \"e4841733-feb5-4510-8b7a-284c296b03e7\" (UID: \"e4841733-feb5-4510-8b7a-284c296b03e7\") " Dec 06 00:21:02 crc kubenswrapper[4734]: I1206 00:21:02.977652 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8httm\" (UniqueName: \"kubernetes.io/projected/e4841733-feb5-4510-8b7a-284c296b03e7-kube-api-access-8httm\") pod \"e4841733-feb5-4510-8b7a-284c296b03e7\" (UID: \"e4841733-feb5-4510-8b7a-284c296b03e7\") " Dec 06 00:21:02 crc kubenswrapper[4734]: I1206 00:21:02.977802 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4841733-feb5-4510-8b7a-284c296b03e7-catalog-content\") pod \"e4841733-feb5-4510-8b7a-284c296b03e7\" (UID: \"e4841733-feb5-4510-8b7a-284c296b03e7\") " Dec 06 00:21:02 crc kubenswrapper[4734]: I1206 00:21:02.979017 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4841733-feb5-4510-8b7a-284c296b03e7-utilities" (OuterVolumeSpecName: "utilities") pod "e4841733-feb5-4510-8b7a-284c296b03e7" (UID: "e4841733-feb5-4510-8b7a-284c296b03e7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 00:21:02 crc kubenswrapper[4734]: I1206 00:21:02.986934 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4841733-feb5-4510-8b7a-284c296b03e7-kube-api-access-8httm" (OuterVolumeSpecName: "kube-api-access-8httm") pod "e4841733-feb5-4510-8b7a-284c296b03e7" (UID: "e4841733-feb5-4510-8b7a-284c296b03e7"). InnerVolumeSpecName "kube-api-access-8httm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:21:03 crc kubenswrapper[4734]: I1206 00:21:03.048151 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4841733-feb5-4510-8b7a-284c296b03e7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e4841733-feb5-4510-8b7a-284c296b03e7" (UID: "e4841733-feb5-4510-8b7a-284c296b03e7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 00:21:03 crc kubenswrapper[4734]: I1206 00:21:03.080315 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4841733-feb5-4510-8b7a-284c296b03e7-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 00:21:03 crc kubenswrapper[4734]: I1206 00:21:03.080354 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8httm\" (UniqueName: \"kubernetes.io/projected/e4841733-feb5-4510-8b7a-284c296b03e7-kube-api-access-8httm\") on node \"crc\" DevicePath \"\"" Dec 06 00:21:03 crc kubenswrapper[4734]: I1206 00:21:03.080369 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4841733-feb5-4510-8b7a-284c296b03e7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 00:21:03 crc kubenswrapper[4734]: I1206 00:21:03.459092 4734 generic.go:334] "Generic (PLEG): container finished" podID="e4841733-feb5-4510-8b7a-284c296b03e7" containerID="fdbc09e05615321e0179b9fa24b4d5c2b4676d317e65a0174207e88008569948" exitCode=0 Dec 06 00:21:03 crc kubenswrapper[4734]: I1206 00:21:03.459168 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bp74j" Dec 06 00:21:03 crc kubenswrapper[4734]: I1206 00:21:03.459175 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bp74j" event={"ID":"e4841733-feb5-4510-8b7a-284c296b03e7","Type":"ContainerDied","Data":"fdbc09e05615321e0179b9fa24b4d5c2b4676d317e65a0174207e88008569948"} Dec 06 00:21:03 crc kubenswrapper[4734]: I1206 00:21:03.459428 4734 scope.go:117] "RemoveContainer" containerID="fdbc09e05615321e0179b9fa24b4d5c2b4676d317e65a0174207e88008569948" Dec 06 00:21:03 crc kubenswrapper[4734]: I1206 00:21:03.459483 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bp74j" event={"ID":"e4841733-feb5-4510-8b7a-284c296b03e7","Type":"ContainerDied","Data":"9a7d1efdda30d7662bac19cf3997e2fff8f7a27ec3287984987c8d68e774180d"} Dec 06 00:21:03 crc kubenswrapper[4734]: I1206 00:21:03.504572 4734 scope.go:117] "RemoveContainer" containerID="54f236e7891cf1dc3e42ba818c07537d2e6fc8fff835d8f00058e890e74c6171" Dec 06 00:21:03 crc kubenswrapper[4734]: I1206 00:21:03.517156 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bp74j"] Dec 06 00:21:03 crc kubenswrapper[4734]: I1206 00:21:03.530316 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bp74j"] Dec 06 00:21:03 crc kubenswrapper[4734]: I1206 00:21:03.559705 4734 scope.go:117] "RemoveContainer" containerID="e76f2b992c022323503d76c4eb85def0433969546a3da7a3011ce1a4cdb48b7e" Dec 06 00:21:03 crc kubenswrapper[4734]: I1206 00:21:03.591986 4734 scope.go:117] "RemoveContainer" containerID="fdbc09e05615321e0179b9fa24b4d5c2b4676d317e65a0174207e88008569948" Dec 06 00:21:03 crc kubenswrapper[4734]: E1206 00:21:03.592690 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"fdbc09e05615321e0179b9fa24b4d5c2b4676d317e65a0174207e88008569948\": container with ID starting with fdbc09e05615321e0179b9fa24b4d5c2b4676d317e65a0174207e88008569948 not found: ID does not exist" containerID="fdbc09e05615321e0179b9fa24b4d5c2b4676d317e65a0174207e88008569948" Dec 06 00:21:03 crc kubenswrapper[4734]: I1206 00:21:03.592742 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdbc09e05615321e0179b9fa24b4d5c2b4676d317e65a0174207e88008569948"} err="failed to get container status \"fdbc09e05615321e0179b9fa24b4d5c2b4676d317e65a0174207e88008569948\": rpc error: code = NotFound desc = could not find container \"fdbc09e05615321e0179b9fa24b4d5c2b4676d317e65a0174207e88008569948\": container with ID starting with fdbc09e05615321e0179b9fa24b4d5c2b4676d317e65a0174207e88008569948 not found: ID does not exist" Dec 06 00:21:03 crc kubenswrapper[4734]: I1206 00:21:03.592775 4734 scope.go:117] "RemoveContainer" containerID="54f236e7891cf1dc3e42ba818c07537d2e6fc8fff835d8f00058e890e74c6171" Dec 06 00:21:03 crc kubenswrapper[4734]: E1206 00:21:03.593313 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54f236e7891cf1dc3e42ba818c07537d2e6fc8fff835d8f00058e890e74c6171\": container with ID starting with 54f236e7891cf1dc3e42ba818c07537d2e6fc8fff835d8f00058e890e74c6171 not found: ID does not exist" containerID="54f236e7891cf1dc3e42ba818c07537d2e6fc8fff835d8f00058e890e74c6171" Dec 06 00:21:03 crc kubenswrapper[4734]: I1206 00:21:03.593346 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54f236e7891cf1dc3e42ba818c07537d2e6fc8fff835d8f00058e890e74c6171"} err="failed to get container status \"54f236e7891cf1dc3e42ba818c07537d2e6fc8fff835d8f00058e890e74c6171\": rpc error: code = NotFound desc = could not find container \"54f236e7891cf1dc3e42ba818c07537d2e6fc8fff835d8f00058e890e74c6171\": container with ID 
starting with 54f236e7891cf1dc3e42ba818c07537d2e6fc8fff835d8f00058e890e74c6171 not found: ID does not exist" Dec 06 00:21:03 crc kubenswrapper[4734]: I1206 00:21:03.593365 4734 scope.go:117] "RemoveContainer" containerID="e76f2b992c022323503d76c4eb85def0433969546a3da7a3011ce1a4cdb48b7e" Dec 06 00:21:03 crc kubenswrapper[4734]: E1206 00:21:03.593681 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e76f2b992c022323503d76c4eb85def0433969546a3da7a3011ce1a4cdb48b7e\": container with ID starting with e76f2b992c022323503d76c4eb85def0433969546a3da7a3011ce1a4cdb48b7e not found: ID does not exist" containerID="e76f2b992c022323503d76c4eb85def0433969546a3da7a3011ce1a4cdb48b7e" Dec 06 00:21:03 crc kubenswrapper[4734]: I1206 00:21:03.593723 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e76f2b992c022323503d76c4eb85def0433969546a3da7a3011ce1a4cdb48b7e"} err="failed to get container status \"e76f2b992c022323503d76c4eb85def0433969546a3da7a3011ce1a4cdb48b7e\": rpc error: code = NotFound desc = could not find container \"e76f2b992c022323503d76c4eb85def0433969546a3da7a3011ce1a4cdb48b7e\": container with ID starting with e76f2b992c022323503d76c4eb85def0433969546a3da7a3011ce1a4cdb48b7e not found: ID does not exist" Dec 06 00:21:03 crc kubenswrapper[4734]: I1206 00:21:03.616781 4734 scope.go:117] "RemoveContainer" containerID="30e2002c7070045c2b20466e69d767cf741f7938dac6f70dfac6c07537e1dd50" Dec 06 00:21:03 crc kubenswrapper[4734]: E1206 00:21:03.617158 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" 
podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:21:03 crc kubenswrapper[4734]: I1206 00:21:03.632461 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4841733-feb5-4510-8b7a-284c296b03e7" path="/var/lib/kubelet/pods/e4841733-feb5-4510-8b7a-284c296b03e7/volumes" Dec 06 00:21:17 crc kubenswrapper[4734]: I1206 00:21:17.614243 4734 scope.go:117] "RemoveContainer" containerID="30e2002c7070045c2b20466e69d767cf741f7938dac6f70dfac6c07537e1dd50" Dec 06 00:21:17 crc kubenswrapper[4734]: E1206 00:21:17.616893 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:21:20 crc kubenswrapper[4734]: I1206 00:21:20.254305 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jnf5s/must-gather-hktf7"] Dec 06 00:21:20 crc kubenswrapper[4734]: E1206 00:21:20.255184 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4841733-feb5-4510-8b7a-284c296b03e7" containerName="extract-content" Dec 06 00:21:20 crc kubenswrapper[4734]: I1206 00:21:20.255198 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4841733-feb5-4510-8b7a-284c296b03e7" containerName="extract-content" Dec 06 00:21:20 crc kubenswrapper[4734]: E1206 00:21:20.255222 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4841733-feb5-4510-8b7a-284c296b03e7" containerName="extract-utilities" Dec 06 00:21:20 crc kubenswrapper[4734]: I1206 00:21:20.255228 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4841733-feb5-4510-8b7a-284c296b03e7" containerName="extract-utilities" Dec 06 00:21:20 crc kubenswrapper[4734]: E1206 00:21:20.255241 
4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4841733-feb5-4510-8b7a-284c296b03e7" containerName="registry-server" Dec 06 00:21:20 crc kubenswrapper[4734]: I1206 00:21:20.255247 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4841733-feb5-4510-8b7a-284c296b03e7" containerName="registry-server" Dec 06 00:21:20 crc kubenswrapper[4734]: I1206 00:21:20.255437 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4841733-feb5-4510-8b7a-284c296b03e7" containerName="registry-server" Dec 06 00:21:20 crc kubenswrapper[4734]: I1206 00:21:20.256771 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jnf5s/must-gather-hktf7" Dec 06 00:21:20 crc kubenswrapper[4734]: I1206 00:21:20.262841 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jnf5s"/"openshift-service-ca.crt" Dec 06 00:21:20 crc kubenswrapper[4734]: I1206 00:21:20.263130 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-jnf5s"/"default-dockercfg-ct59b" Dec 06 00:21:20 crc kubenswrapper[4734]: I1206 00:21:20.266992 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-jnf5s"/"kube-root-ca.crt" Dec 06 00:21:20 crc kubenswrapper[4734]: I1206 00:21:20.269260 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jnf5s/must-gather-hktf7"] Dec 06 00:21:20 crc kubenswrapper[4734]: I1206 00:21:20.328569 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8d96ce40-7782-46c2-816b-6e792c06b2f1-must-gather-output\") pod \"must-gather-hktf7\" (UID: \"8d96ce40-7782-46c2-816b-6e792c06b2f1\") " pod="openshift-must-gather-jnf5s/must-gather-hktf7" Dec 06 00:21:20 crc kubenswrapper[4734]: I1206 00:21:20.328671 4734 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82w7j\" (UniqueName: \"kubernetes.io/projected/8d96ce40-7782-46c2-816b-6e792c06b2f1-kube-api-access-82w7j\") pod \"must-gather-hktf7\" (UID: \"8d96ce40-7782-46c2-816b-6e792c06b2f1\") " pod="openshift-must-gather-jnf5s/must-gather-hktf7" Dec 06 00:21:20 crc kubenswrapper[4734]: I1206 00:21:20.431168 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82w7j\" (UniqueName: \"kubernetes.io/projected/8d96ce40-7782-46c2-816b-6e792c06b2f1-kube-api-access-82w7j\") pod \"must-gather-hktf7\" (UID: \"8d96ce40-7782-46c2-816b-6e792c06b2f1\") " pod="openshift-must-gather-jnf5s/must-gather-hktf7" Dec 06 00:21:20 crc kubenswrapper[4734]: I1206 00:21:20.431322 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8d96ce40-7782-46c2-816b-6e792c06b2f1-must-gather-output\") pod \"must-gather-hktf7\" (UID: \"8d96ce40-7782-46c2-816b-6e792c06b2f1\") " pod="openshift-must-gather-jnf5s/must-gather-hktf7" Dec 06 00:21:20 crc kubenswrapper[4734]: I1206 00:21:20.431900 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8d96ce40-7782-46c2-816b-6e792c06b2f1-must-gather-output\") pod \"must-gather-hktf7\" (UID: \"8d96ce40-7782-46c2-816b-6e792c06b2f1\") " pod="openshift-must-gather-jnf5s/must-gather-hktf7" Dec 06 00:21:20 crc kubenswrapper[4734]: I1206 00:21:20.469633 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82w7j\" (UniqueName: \"kubernetes.io/projected/8d96ce40-7782-46c2-816b-6e792c06b2f1-kube-api-access-82w7j\") pod \"must-gather-hktf7\" (UID: \"8d96ce40-7782-46c2-816b-6e792c06b2f1\") " pod="openshift-must-gather-jnf5s/must-gather-hktf7" Dec 06 00:21:20 crc kubenswrapper[4734]: I1206 00:21:20.579029 4734 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-must-gather-jnf5s/must-gather-hktf7" Dec 06 00:21:21 crc kubenswrapper[4734]: I1206 00:21:21.121389 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jnf5s/must-gather-hktf7"] Dec 06 00:21:21 crc kubenswrapper[4734]: I1206 00:21:21.136557 4734 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 00:21:21 crc kubenswrapper[4734]: I1206 00:21:21.663213 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jnf5s/must-gather-hktf7" event={"ID":"8d96ce40-7782-46c2-816b-6e792c06b2f1","Type":"ContainerStarted","Data":"7fb11b8276e994a49a6437d7346db917f652d9f8b06d0f44c0ed1510433a47b7"} Dec 06 00:21:26 crc kubenswrapper[4734]: I1206 00:21:26.725085 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jnf5s/must-gather-hktf7" event={"ID":"8d96ce40-7782-46c2-816b-6e792c06b2f1","Type":"ContainerStarted","Data":"41371b6cee252591242fc625ba513d815f17460aaec8ae08ca46e4cf73735cb7"} Dec 06 00:21:26 crc kubenswrapper[4734]: I1206 00:21:26.727306 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jnf5s/must-gather-hktf7" event={"ID":"8d96ce40-7782-46c2-816b-6e792c06b2f1","Type":"ContainerStarted","Data":"6e3cbdf545de6ed0d0a79997cac3743afad5ce1e25b03bb3b6f513afcf5c6fad"} Dec 06 00:21:26 crc kubenswrapper[4734]: I1206 00:21:26.750492 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jnf5s/must-gather-hktf7" podStartSLOduration=2.424505879 podStartE2EDuration="6.750456037s" podCreationTimestamp="2025-12-06 00:21:20 +0000 UTC" firstStartedPulling="2025-12-06 00:21:21.13622477 +0000 UTC m=+3701.819629046" lastFinishedPulling="2025-12-06 00:21:25.462174928 +0000 UTC m=+3706.145579204" observedRunningTime="2025-12-06 00:21:26.743765883 +0000 UTC m=+3707.427170179" watchObservedRunningTime="2025-12-06 00:21:26.750456037 +0000 UTC 
m=+3707.433860313" Dec 06 00:21:30 crc kubenswrapper[4734]: I1206 00:21:30.012975 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jnf5s/crc-debug-cf4tz"] Dec 06 00:21:30 crc kubenswrapper[4734]: I1206 00:21:30.016033 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jnf5s/crc-debug-cf4tz" Dec 06 00:21:30 crc kubenswrapper[4734]: I1206 00:21:30.123767 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/80c215f5-dfdd-4e02-996c-d959cd34a703-host\") pod \"crc-debug-cf4tz\" (UID: \"80c215f5-dfdd-4e02-996c-d959cd34a703\") " pod="openshift-must-gather-jnf5s/crc-debug-cf4tz" Dec 06 00:21:30 crc kubenswrapper[4734]: I1206 00:21:30.124416 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt5d7\" (UniqueName: \"kubernetes.io/projected/80c215f5-dfdd-4e02-996c-d959cd34a703-kube-api-access-zt5d7\") pod \"crc-debug-cf4tz\" (UID: \"80c215f5-dfdd-4e02-996c-d959cd34a703\") " pod="openshift-must-gather-jnf5s/crc-debug-cf4tz" Dec 06 00:21:30 crc kubenswrapper[4734]: I1206 00:21:30.227393 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/80c215f5-dfdd-4e02-996c-d959cd34a703-host\") pod \"crc-debug-cf4tz\" (UID: \"80c215f5-dfdd-4e02-996c-d959cd34a703\") " pod="openshift-must-gather-jnf5s/crc-debug-cf4tz" Dec 06 00:21:30 crc kubenswrapper[4734]: I1206 00:21:30.227596 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt5d7\" (UniqueName: \"kubernetes.io/projected/80c215f5-dfdd-4e02-996c-d959cd34a703-kube-api-access-zt5d7\") pod \"crc-debug-cf4tz\" (UID: \"80c215f5-dfdd-4e02-996c-d959cd34a703\") " pod="openshift-must-gather-jnf5s/crc-debug-cf4tz" Dec 06 00:21:30 crc kubenswrapper[4734]: I1206 00:21:30.227603 4734 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/80c215f5-dfdd-4e02-996c-d959cd34a703-host\") pod \"crc-debug-cf4tz\" (UID: \"80c215f5-dfdd-4e02-996c-d959cd34a703\") " pod="openshift-must-gather-jnf5s/crc-debug-cf4tz" Dec 06 00:21:30 crc kubenswrapper[4734]: I1206 00:21:30.248562 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt5d7\" (UniqueName: \"kubernetes.io/projected/80c215f5-dfdd-4e02-996c-d959cd34a703-kube-api-access-zt5d7\") pod \"crc-debug-cf4tz\" (UID: \"80c215f5-dfdd-4e02-996c-d959cd34a703\") " pod="openshift-must-gather-jnf5s/crc-debug-cf4tz" Dec 06 00:21:30 crc kubenswrapper[4734]: I1206 00:21:30.340216 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jnf5s/crc-debug-cf4tz" Dec 06 00:21:30 crc kubenswrapper[4734]: I1206 00:21:30.614020 4734 scope.go:117] "RemoveContainer" containerID="30e2002c7070045c2b20466e69d767cf741f7938dac6f70dfac6c07537e1dd50" Dec 06 00:21:30 crc kubenswrapper[4734]: E1206 00:21:30.614446 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:21:30 crc kubenswrapper[4734]: I1206 00:21:30.765484 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jnf5s/crc-debug-cf4tz" event={"ID":"80c215f5-dfdd-4e02-996c-d959cd34a703","Type":"ContainerStarted","Data":"6249342f9947e279071ac0260e5fa355bd694fa3497f2101491b8782eed9c965"} Dec 06 00:21:43 crc kubenswrapper[4734]: I1206 00:21:43.614406 4734 scope.go:117] "RemoveContainer" 
containerID="30e2002c7070045c2b20466e69d767cf741f7938dac6f70dfac6c07537e1dd50" Dec 06 00:21:43 crc kubenswrapper[4734]: E1206 00:21:43.615428 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:21:44 crc kubenswrapper[4734]: I1206 00:21:44.957695 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jnf5s/crc-debug-cf4tz" event={"ID":"80c215f5-dfdd-4e02-996c-d959cd34a703","Type":"ContainerStarted","Data":"1c59d1d92077d3c61211c8a08c504d7769ff732e119e2b9135eee4d5b8458ecf"} Dec 06 00:21:44 crc kubenswrapper[4734]: I1206 00:21:44.987120 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jnf5s/crc-debug-cf4tz" podStartSLOduration=1.8920579119999998 podStartE2EDuration="15.987090767s" podCreationTimestamp="2025-12-06 00:21:29 +0000 UTC" firstStartedPulling="2025-12-06 00:21:30.384144645 +0000 UTC m=+3711.067548921" lastFinishedPulling="2025-12-06 00:21:44.4791775 +0000 UTC m=+3725.162581776" observedRunningTime="2025-12-06 00:21:44.975935775 +0000 UTC m=+3725.659340051" watchObservedRunningTime="2025-12-06 00:21:44.987090767 +0000 UTC m=+3725.670495043" Dec 06 00:21:56 crc kubenswrapper[4734]: I1206 00:21:56.614492 4734 scope.go:117] "RemoveContainer" containerID="30e2002c7070045c2b20466e69d767cf741f7938dac6f70dfac6c07537e1dd50" Dec 06 00:21:56 crc kubenswrapper[4734]: E1206 00:21:56.615426 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:22:07 crc kubenswrapper[4734]: I1206 00:22:07.614847 4734 scope.go:117] "RemoveContainer" containerID="30e2002c7070045c2b20466e69d767cf741f7938dac6f70dfac6c07537e1dd50" Dec 06 00:22:07 crc kubenswrapper[4734]: E1206 00:22:07.616041 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:22:19 crc kubenswrapper[4734]: I1206 00:22:19.626946 4734 scope.go:117] "RemoveContainer" containerID="30e2002c7070045c2b20466e69d767cf741f7938dac6f70dfac6c07537e1dd50" Dec 06 00:22:19 crc kubenswrapper[4734]: E1206 00:22:19.628005 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:22:30 crc kubenswrapper[4734]: I1206 00:22:30.437014 4734 generic.go:334] "Generic (PLEG): container finished" podID="80c215f5-dfdd-4e02-996c-d959cd34a703" containerID="1c59d1d92077d3c61211c8a08c504d7769ff732e119e2b9135eee4d5b8458ecf" exitCode=0 Dec 06 00:22:30 crc kubenswrapper[4734]: I1206 00:22:30.437110 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jnf5s/crc-debug-cf4tz" 
event={"ID":"80c215f5-dfdd-4e02-996c-d959cd34a703","Type":"ContainerDied","Data":"1c59d1d92077d3c61211c8a08c504d7769ff732e119e2b9135eee4d5b8458ecf"} Dec 06 00:22:30 crc kubenswrapper[4734]: I1206 00:22:30.614015 4734 scope.go:117] "RemoveContainer" containerID="30e2002c7070045c2b20466e69d767cf741f7938dac6f70dfac6c07537e1dd50" Dec 06 00:22:30 crc kubenswrapper[4734]: E1206 00:22:30.614293 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:22:31 crc kubenswrapper[4734]: I1206 00:22:31.562414 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jnf5s/crc-debug-cf4tz" Dec 06 00:22:31 crc kubenswrapper[4734]: I1206 00:22:31.597910 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jnf5s/crc-debug-cf4tz"] Dec 06 00:22:31 crc kubenswrapper[4734]: I1206 00:22:31.607332 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jnf5s/crc-debug-cf4tz"] Dec 06 00:22:31 crc kubenswrapper[4734]: I1206 00:22:31.682576 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/80c215f5-dfdd-4e02-996c-d959cd34a703-host\") pod \"80c215f5-dfdd-4e02-996c-d959cd34a703\" (UID: \"80c215f5-dfdd-4e02-996c-d959cd34a703\") " Dec 06 00:22:31 crc kubenswrapper[4734]: I1206 00:22:31.682731 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zt5d7\" (UniqueName: \"kubernetes.io/projected/80c215f5-dfdd-4e02-996c-d959cd34a703-kube-api-access-zt5d7\") pod 
\"80c215f5-dfdd-4e02-996c-d959cd34a703\" (UID: \"80c215f5-dfdd-4e02-996c-d959cd34a703\") " Dec 06 00:22:31 crc kubenswrapper[4734]: I1206 00:22:31.684471 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/80c215f5-dfdd-4e02-996c-d959cd34a703-host" (OuterVolumeSpecName: "host") pod "80c215f5-dfdd-4e02-996c-d959cd34a703" (UID: "80c215f5-dfdd-4e02-996c-d959cd34a703"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 00:22:31 crc kubenswrapper[4734]: I1206 00:22:31.691781 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80c215f5-dfdd-4e02-996c-d959cd34a703-kube-api-access-zt5d7" (OuterVolumeSpecName: "kube-api-access-zt5d7") pod "80c215f5-dfdd-4e02-996c-d959cd34a703" (UID: "80c215f5-dfdd-4e02-996c-d959cd34a703"). InnerVolumeSpecName "kube-api-access-zt5d7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:22:31 crc kubenswrapper[4734]: I1206 00:22:31.785986 4734 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/80c215f5-dfdd-4e02-996c-d959cd34a703-host\") on node \"crc\" DevicePath \"\"" Dec 06 00:22:31 crc kubenswrapper[4734]: I1206 00:22:31.786027 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zt5d7\" (UniqueName: \"kubernetes.io/projected/80c215f5-dfdd-4e02-996c-d959cd34a703-kube-api-access-zt5d7\") on node \"crc\" DevicePath \"\"" Dec 06 00:22:32 crc kubenswrapper[4734]: I1206 00:22:32.460241 4734 scope.go:117] "RemoveContainer" containerID="1c59d1d92077d3c61211c8a08c504d7769ff732e119e2b9135eee4d5b8458ecf" Dec 06 00:22:32 crc kubenswrapper[4734]: I1206 00:22:32.460284 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jnf5s/crc-debug-cf4tz" Dec 06 00:22:32 crc kubenswrapper[4734]: I1206 00:22:32.780357 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jnf5s/crc-debug-9fb2k"] Dec 06 00:22:32 crc kubenswrapper[4734]: E1206 00:22:32.780853 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80c215f5-dfdd-4e02-996c-d959cd34a703" containerName="container-00" Dec 06 00:22:32 crc kubenswrapper[4734]: I1206 00:22:32.780868 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="80c215f5-dfdd-4e02-996c-d959cd34a703" containerName="container-00" Dec 06 00:22:32 crc kubenswrapper[4734]: I1206 00:22:32.781110 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="80c215f5-dfdd-4e02-996c-d959cd34a703" containerName="container-00" Dec 06 00:22:32 crc kubenswrapper[4734]: I1206 00:22:32.781958 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jnf5s/crc-debug-9fb2k" Dec 06 00:22:32 crc kubenswrapper[4734]: I1206 00:22:32.910363 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgqh8\" (UniqueName: \"kubernetes.io/projected/15eeeddf-313a-45d9-8616-9b0a2adf6338-kube-api-access-kgqh8\") pod \"crc-debug-9fb2k\" (UID: \"15eeeddf-313a-45d9-8616-9b0a2adf6338\") " pod="openshift-must-gather-jnf5s/crc-debug-9fb2k" Dec 06 00:22:32 crc kubenswrapper[4734]: I1206 00:22:32.910595 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/15eeeddf-313a-45d9-8616-9b0a2adf6338-host\") pod \"crc-debug-9fb2k\" (UID: \"15eeeddf-313a-45d9-8616-9b0a2adf6338\") " pod="openshift-must-gather-jnf5s/crc-debug-9fb2k" Dec 06 00:22:33 crc kubenswrapper[4734]: I1206 00:22:33.012312 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/15eeeddf-313a-45d9-8616-9b0a2adf6338-host\") pod \"crc-debug-9fb2k\" (UID: \"15eeeddf-313a-45d9-8616-9b0a2adf6338\") " pod="openshift-must-gather-jnf5s/crc-debug-9fb2k" Dec 06 00:22:33 crc kubenswrapper[4734]: I1206 00:22:33.012463 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgqh8\" (UniqueName: \"kubernetes.io/projected/15eeeddf-313a-45d9-8616-9b0a2adf6338-kube-api-access-kgqh8\") pod \"crc-debug-9fb2k\" (UID: \"15eeeddf-313a-45d9-8616-9b0a2adf6338\") " pod="openshift-must-gather-jnf5s/crc-debug-9fb2k" Dec 06 00:22:33 crc kubenswrapper[4734]: I1206 00:22:33.012585 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/15eeeddf-313a-45d9-8616-9b0a2adf6338-host\") pod \"crc-debug-9fb2k\" (UID: \"15eeeddf-313a-45d9-8616-9b0a2adf6338\") " pod="openshift-must-gather-jnf5s/crc-debug-9fb2k" Dec 06 00:22:33 crc kubenswrapper[4734]: I1206 00:22:33.033217 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgqh8\" (UniqueName: \"kubernetes.io/projected/15eeeddf-313a-45d9-8616-9b0a2adf6338-kube-api-access-kgqh8\") pod \"crc-debug-9fb2k\" (UID: \"15eeeddf-313a-45d9-8616-9b0a2adf6338\") " pod="openshift-must-gather-jnf5s/crc-debug-9fb2k" Dec 06 00:22:33 crc kubenswrapper[4734]: I1206 00:22:33.103613 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jnf5s/crc-debug-9fb2k" Dec 06 00:22:33 crc kubenswrapper[4734]: I1206 00:22:33.474982 4734 generic.go:334] "Generic (PLEG): container finished" podID="15eeeddf-313a-45d9-8616-9b0a2adf6338" containerID="bc969b76a100f0f9add329a6e69318987c7d1a46cd1f297a1af7dbce2ba20782" exitCode=0 Dec 06 00:22:33 crc kubenswrapper[4734]: I1206 00:22:33.475069 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jnf5s/crc-debug-9fb2k" event={"ID":"15eeeddf-313a-45d9-8616-9b0a2adf6338","Type":"ContainerDied","Data":"bc969b76a100f0f9add329a6e69318987c7d1a46cd1f297a1af7dbce2ba20782"} Dec 06 00:22:33 crc kubenswrapper[4734]: I1206 00:22:33.475446 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jnf5s/crc-debug-9fb2k" event={"ID":"15eeeddf-313a-45d9-8616-9b0a2adf6338","Type":"ContainerStarted","Data":"63e748eca09d3bb366d59f03d9a93c8c0f767ce7e2dd2b77c69b24fcd3cb87e4"} Dec 06 00:22:33 crc kubenswrapper[4734]: I1206 00:22:33.646670 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80c215f5-dfdd-4e02-996c-d959cd34a703" path="/var/lib/kubelet/pods/80c215f5-dfdd-4e02-996c-d959cd34a703/volumes" Dec 06 00:22:34 crc kubenswrapper[4734]: I1206 00:22:34.058707 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jnf5s/crc-debug-9fb2k"] Dec 06 00:22:34 crc kubenswrapper[4734]: I1206 00:22:34.072571 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jnf5s/crc-debug-9fb2k"] Dec 06 00:22:34 crc kubenswrapper[4734]: I1206 00:22:34.597977 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jnf5s/crc-debug-9fb2k" Dec 06 00:22:34 crc kubenswrapper[4734]: I1206 00:22:34.672672 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/15eeeddf-313a-45d9-8616-9b0a2adf6338-host\") pod \"15eeeddf-313a-45d9-8616-9b0a2adf6338\" (UID: \"15eeeddf-313a-45d9-8616-9b0a2adf6338\") " Dec 06 00:22:34 crc kubenswrapper[4734]: I1206 00:22:34.672801 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15eeeddf-313a-45d9-8616-9b0a2adf6338-host" (OuterVolumeSpecName: "host") pod "15eeeddf-313a-45d9-8616-9b0a2adf6338" (UID: "15eeeddf-313a-45d9-8616-9b0a2adf6338"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 00:22:34 crc kubenswrapper[4734]: I1206 00:22:34.672963 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgqh8\" (UniqueName: \"kubernetes.io/projected/15eeeddf-313a-45d9-8616-9b0a2adf6338-kube-api-access-kgqh8\") pod \"15eeeddf-313a-45d9-8616-9b0a2adf6338\" (UID: \"15eeeddf-313a-45d9-8616-9b0a2adf6338\") " Dec 06 00:22:34 crc kubenswrapper[4734]: I1206 00:22:34.674841 4734 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/15eeeddf-313a-45d9-8616-9b0a2adf6338-host\") on node \"crc\" DevicePath \"\"" Dec 06 00:22:34 crc kubenswrapper[4734]: I1206 00:22:34.680306 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15eeeddf-313a-45d9-8616-9b0a2adf6338-kube-api-access-kgqh8" (OuterVolumeSpecName: "kube-api-access-kgqh8") pod "15eeeddf-313a-45d9-8616-9b0a2adf6338" (UID: "15eeeddf-313a-45d9-8616-9b0a2adf6338"). InnerVolumeSpecName "kube-api-access-kgqh8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:22:34 crc kubenswrapper[4734]: I1206 00:22:34.777588 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgqh8\" (UniqueName: \"kubernetes.io/projected/15eeeddf-313a-45d9-8616-9b0a2adf6338-kube-api-access-kgqh8\") on node \"crc\" DevicePath \"\"" Dec 06 00:22:35 crc kubenswrapper[4734]: I1206 00:22:35.241814 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jnf5s/crc-debug-bb4bq"] Dec 06 00:22:35 crc kubenswrapper[4734]: E1206 00:22:35.242343 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15eeeddf-313a-45d9-8616-9b0a2adf6338" containerName="container-00" Dec 06 00:22:35 crc kubenswrapper[4734]: I1206 00:22:35.242359 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="15eeeddf-313a-45d9-8616-9b0a2adf6338" containerName="container-00" Dec 06 00:22:35 crc kubenswrapper[4734]: I1206 00:22:35.242709 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="15eeeddf-313a-45d9-8616-9b0a2adf6338" containerName="container-00" Dec 06 00:22:35 crc kubenswrapper[4734]: I1206 00:22:35.243592 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jnf5s/crc-debug-bb4bq" Dec 06 00:22:35 crc kubenswrapper[4734]: I1206 00:22:35.389245 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/558f03d8-9b38-4f41-8b98-1efcaf4242b7-host\") pod \"crc-debug-bb4bq\" (UID: \"558f03d8-9b38-4f41-8b98-1efcaf4242b7\") " pod="openshift-must-gather-jnf5s/crc-debug-bb4bq" Dec 06 00:22:35 crc kubenswrapper[4734]: I1206 00:22:35.389398 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvzwm\" (UniqueName: \"kubernetes.io/projected/558f03d8-9b38-4f41-8b98-1efcaf4242b7-kube-api-access-dvzwm\") pod \"crc-debug-bb4bq\" (UID: \"558f03d8-9b38-4f41-8b98-1efcaf4242b7\") " pod="openshift-must-gather-jnf5s/crc-debug-bb4bq" Dec 06 00:22:35 crc kubenswrapper[4734]: I1206 00:22:35.493306 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvzwm\" (UniqueName: \"kubernetes.io/projected/558f03d8-9b38-4f41-8b98-1efcaf4242b7-kube-api-access-dvzwm\") pod \"crc-debug-bb4bq\" (UID: \"558f03d8-9b38-4f41-8b98-1efcaf4242b7\") " pod="openshift-must-gather-jnf5s/crc-debug-bb4bq" Dec 06 00:22:35 crc kubenswrapper[4734]: I1206 00:22:35.493510 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/558f03d8-9b38-4f41-8b98-1efcaf4242b7-host\") pod \"crc-debug-bb4bq\" (UID: \"558f03d8-9b38-4f41-8b98-1efcaf4242b7\") " pod="openshift-must-gather-jnf5s/crc-debug-bb4bq" Dec 06 00:22:35 crc kubenswrapper[4734]: I1206 00:22:35.493623 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/558f03d8-9b38-4f41-8b98-1efcaf4242b7-host\") pod \"crc-debug-bb4bq\" (UID: \"558f03d8-9b38-4f41-8b98-1efcaf4242b7\") " pod="openshift-must-gather-jnf5s/crc-debug-bb4bq" Dec 06 00:22:35 crc 
kubenswrapper[4734]: I1206 00:22:35.497463 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63e748eca09d3bb366d59f03d9a93c8c0f767ce7e2dd2b77c69b24fcd3cb87e4" Dec 06 00:22:35 crc kubenswrapper[4734]: I1206 00:22:35.497575 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jnf5s/crc-debug-9fb2k" Dec 06 00:22:35 crc kubenswrapper[4734]: I1206 00:22:35.513695 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvzwm\" (UniqueName: \"kubernetes.io/projected/558f03d8-9b38-4f41-8b98-1efcaf4242b7-kube-api-access-dvzwm\") pod \"crc-debug-bb4bq\" (UID: \"558f03d8-9b38-4f41-8b98-1efcaf4242b7\") " pod="openshift-must-gather-jnf5s/crc-debug-bb4bq" Dec 06 00:22:35 crc kubenswrapper[4734]: I1206 00:22:35.565019 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jnf5s/crc-debug-bb4bq" Dec 06 00:22:35 crc kubenswrapper[4734]: W1206 00:22:35.592557 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod558f03d8_9b38_4f41_8b98_1efcaf4242b7.slice/crio-3fbf8692a4a294471a10fccdf8d2dcc8a9f77b36ee566c41dca322a270dda75f WatchSource:0}: Error finding container 3fbf8692a4a294471a10fccdf8d2dcc8a9f77b36ee566c41dca322a270dda75f: Status 404 returned error can't find the container with id 3fbf8692a4a294471a10fccdf8d2dcc8a9f77b36ee566c41dca322a270dda75f Dec 06 00:22:35 crc kubenswrapper[4734]: I1206 00:22:35.629042 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15eeeddf-313a-45d9-8616-9b0a2adf6338" path="/var/lib/kubelet/pods/15eeeddf-313a-45d9-8616-9b0a2adf6338/volumes" Dec 06 00:22:36 crc kubenswrapper[4734]: I1206 00:22:36.513512 4734 generic.go:334] "Generic (PLEG): container finished" podID="558f03d8-9b38-4f41-8b98-1efcaf4242b7" 
containerID="32561cb031356c9aaadde50a77058947922f4996049f2ce9cee9ab14ff7b801e" exitCode=0 Dec 06 00:22:36 crc kubenswrapper[4734]: I1206 00:22:36.513609 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jnf5s/crc-debug-bb4bq" event={"ID":"558f03d8-9b38-4f41-8b98-1efcaf4242b7","Type":"ContainerDied","Data":"32561cb031356c9aaadde50a77058947922f4996049f2ce9cee9ab14ff7b801e"} Dec 06 00:22:36 crc kubenswrapper[4734]: I1206 00:22:36.514025 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jnf5s/crc-debug-bb4bq" event={"ID":"558f03d8-9b38-4f41-8b98-1efcaf4242b7","Type":"ContainerStarted","Data":"3fbf8692a4a294471a10fccdf8d2dcc8a9f77b36ee566c41dca322a270dda75f"} Dec 06 00:22:36 crc kubenswrapper[4734]: I1206 00:22:36.565944 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jnf5s/crc-debug-bb4bq"] Dec 06 00:22:36 crc kubenswrapper[4734]: I1206 00:22:36.576335 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jnf5s/crc-debug-bb4bq"] Dec 06 00:22:38 crc kubenswrapper[4734]: I1206 00:22:38.105389 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jnf5s/crc-debug-bb4bq" Dec 06 00:22:38 crc kubenswrapper[4734]: I1206 00:22:38.256686 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvzwm\" (UniqueName: \"kubernetes.io/projected/558f03d8-9b38-4f41-8b98-1efcaf4242b7-kube-api-access-dvzwm\") pod \"558f03d8-9b38-4f41-8b98-1efcaf4242b7\" (UID: \"558f03d8-9b38-4f41-8b98-1efcaf4242b7\") " Dec 06 00:22:38 crc kubenswrapper[4734]: I1206 00:22:38.256928 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/558f03d8-9b38-4f41-8b98-1efcaf4242b7-host\") pod \"558f03d8-9b38-4f41-8b98-1efcaf4242b7\" (UID: \"558f03d8-9b38-4f41-8b98-1efcaf4242b7\") " Dec 06 00:22:38 crc kubenswrapper[4734]: I1206 00:22:38.257344 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/558f03d8-9b38-4f41-8b98-1efcaf4242b7-host" (OuterVolumeSpecName: "host") pod "558f03d8-9b38-4f41-8b98-1efcaf4242b7" (UID: "558f03d8-9b38-4f41-8b98-1efcaf4242b7"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 00:22:38 crc kubenswrapper[4734]: I1206 00:22:38.258201 4734 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/558f03d8-9b38-4f41-8b98-1efcaf4242b7-host\") on node \"crc\" DevicePath \"\"" Dec 06 00:22:38 crc kubenswrapper[4734]: I1206 00:22:38.263701 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/558f03d8-9b38-4f41-8b98-1efcaf4242b7-kube-api-access-dvzwm" (OuterVolumeSpecName: "kube-api-access-dvzwm") pod "558f03d8-9b38-4f41-8b98-1efcaf4242b7" (UID: "558f03d8-9b38-4f41-8b98-1efcaf4242b7"). InnerVolumeSpecName "kube-api-access-dvzwm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:22:38 crc kubenswrapper[4734]: I1206 00:22:38.360505 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvzwm\" (UniqueName: \"kubernetes.io/projected/558f03d8-9b38-4f41-8b98-1efcaf4242b7-kube-api-access-dvzwm\") on node \"crc\" DevicePath \"\"" Dec 06 00:22:38 crc kubenswrapper[4734]: I1206 00:22:38.537453 4734 scope.go:117] "RemoveContainer" containerID="32561cb031356c9aaadde50a77058947922f4996049f2ce9cee9ab14ff7b801e" Dec 06 00:22:38 crc kubenswrapper[4734]: I1206 00:22:38.537473 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jnf5s/crc-debug-bb4bq" Dec 06 00:22:39 crc kubenswrapper[4734]: I1206 00:22:39.635740 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="558f03d8-9b38-4f41-8b98-1efcaf4242b7" path="/var/lib/kubelet/pods/558f03d8-9b38-4f41-8b98-1efcaf4242b7/volumes" Dec 06 00:22:41 crc kubenswrapper[4734]: I1206 00:22:41.614348 4734 scope.go:117] "RemoveContainer" containerID="30e2002c7070045c2b20466e69d767cf741f7938dac6f70dfac6c07537e1dd50" Dec 06 00:22:41 crc kubenswrapper[4734]: E1206 00:22:41.615867 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:22:55 crc kubenswrapper[4734]: I1206 00:22:55.302150 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5574c9fdf8-q682b_d3c7aa3a-ca07-4476-8b39-06479afae42d/barbican-api/0.log" Dec 06 00:22:55 crc kubenswrapper[4734]: I1206 00:22:55.474762 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-5d4f95c8c8-c5lws_0266e747-392d-46c1-bc3e-0ef614db01e3/barbican-keystone-listener/0.log" Dec 06 00:22:55 crc kubenswrapper[4734]: I1206 00:22:55.497639 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5574c9fdf8-q682b_d3c7aa3a-ca07-4476-8b39-06479afae42d/barbican-api-log/0.log" Dec 06 00:22:55 crc kubenswrapper[4734]: I1206 00:22:55.562965 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5d4f95c8c8-c5lws_0266e747-392d-46c1-bc3e-0ef614db01e3/barbican-keystone-listener-log/0.log" Dec 06 00:22:55 crc kubenswrapper[4734]: I1206 00:22:55.614858 4734 scope.go:117] "RemoveContainer" containerID="30e2002c7070045c2b20466e69d767cf741f7938dac6f70dfac6c07537e1dd50" Dec 06 00:22:55 crc kubenswrapper[4734]: E1206 00:22:55.615156 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:22:55 crc kubenswrapper[4734]: I1206 00:22:55.754170 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-57c5555847-t5zf4_b35b4bd8-efbd-4f96-9962-490ea41d44d1/barbican-worker/0.log" Dec 06 00:22:55 crc kubenswrapper[4734]: I1206 00:22:55.810958 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-57c5555847-t5zf4_b35b4bd8-efbd-4f96-9962-490ea41d44d1/barbican-worker-log/0.log" Dec 06 00:22:56 crc kubenswrapper[4734]: I1206 00:22:56.109577 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw_faef139d-614e-4c50-a383-8dd231a47b83/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 00:22:56 crc kubenswrapper[4734]: I1206 00:22:56.179892 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_90a03731-2e0d-4698-a55e-0af3ef5372be/ceilometer-notification-agent/0.log" Dec 06 00:22:56 crc kubenswrapper[4734]: I1206 00:22:56.182320 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_90a03731-2e0d-4698-a55e-0af3ef5372be/ceilometer-central-agent/0.log" Dec 06 00:22:56 crc kubenswrapper[4734]: I1206 00:22:56.327894 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_90a03731-2e0d-4698-a55e-0af3ef5372be/proxy-httpd/0.log" Dec 06 00:22:56 crc kubenswrapper[4734]: I1206 00:22:56.359434 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_90a03731-2e0d-4698-a55e-0af3ef5372be/sg-core/0.log" Dec 06 00:22:56 crc kubenswrapper[4734]: I1206 00:22:56.489273 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_6c79a17a-a1f1-481f-90de-cdcfe632a079/cinder-api/0.log" Dec 06 00:22:56 crc kubenswrapper[4734]: I1206 00:22:56.550778 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_6c79a17a-a1f1-481f-90de-cdcfe632a079/cinder-api-log/0.log" Dec 06 00:22:56 crc kubenswrapper[4734]: I1206 00:22:56.741113 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-db-purge-29416321-tqhwd_3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124/cinder-db-purge/0.log" Dec 06 00:22:56 crc kubenswrapper[4734]: I1206 00:22:56.849891 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d36cae72-5806-4d9c-80a9-c396c5ca00d6/cinder-scheduler/0.log" Dec 06 00:22:56 crc kubenswrapper[4734]: I1206 00:22:56.975167 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-scheduler-0_d36cae72-5806-4d9c-80a9-c396c5ca00d6/probe/0.log" Dec 06 00:22:57 crc kubenswrapper[4734]: I1206 00:22:57.030941 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-xl5qq_f183bc38-e046-45f6-b96a-440e596c8088/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 00:22:57 crc kubenswrapper[4734]: I1206 00:22:57.257020 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-hwrl9_6de30094-9f75-467b-a935-3abbdf98e94c/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 00:22:57 crc kubenswrapper[4734]: I1206 00:22:57.352058 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-p7nf9_83b046ba-a4ad-4e9b-b266-a23db4ef72ae/init/0.log" Dec 06 00:22:57 crc kubenswrapper[4734]: I1206 00:22:57.548578 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-p7nf9_83b046ba-a4ad-4e9b-b266-a23db4ef72ae/init/0.log" Dec 06 00:22:57 crc kubenswrapper[4734]: I1206 00:22:57.608877 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-p7nf9_83b046ba-a4ad-4e9b-b266-a23db4ef72ae/dnsmasq-dns/0.log" Dec 06 00:22:57 crc kubenswrapper[4734]: I1206 00:22:57.635809 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-b8d9k_b881d911-43a8-4290-98e8-89e268e162e4/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 00:22:58 crc kubenswrapper[4734]: I1206 00:22:58.072345 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-db-purge-29416321-zr2lx_e30aee89-812f-4e60-997e-54de845b7afe/glance-dbpurge/0.log" Dec 06 00:22:58 crc kubenswrapper[4734]: I1206 00:22:58.115811 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_eddf1584-198a-4279-a09a-30500f1842f3/glance-httpd/0.log" Dec 06 00:22:58 crc kubenswrapper[4734]: I1206 00:22:58.350786 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_eddf1584-198a-4279-a09a-30500f1842f3/glance-log/0.log" Dec 06 00:22:58 crc kubenswrapper[4734]: I1206 00:22:58.397835 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d6b10458-86e7-4568-b3b5-2a3e090b90a8/glance-httpd/0.log" Dec 06 00:22:58 crc kubenswrapper[4734]: I1206 00:22:58.459500 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d6b10458-86e7-4568-b3b5-2a3e090b90a8/glance-log/0.log" Dec 06 00:22:58 crc kubenswrapper[4734]: I1206 00:22:58.792776 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-755fc898d8-dlnbz_bbcbbde9-55c9-48dc-866d-ab670775e9b3/horizon/0.log" Dec 06 00:22:58 crc kubenswrapper[4734]: I1206 00:22:58.955209 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv_131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 00:22:59 crc kubenswrapper[4734]: I1206 00:22:59.108867 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-755fc898d8-dlnbz_bbcbbde9-55c9-48dc-866d-ab670775e9b3/horizon-log/0.log" Dec 06 00:22:59 crc kubenswrapper[4734]: I1206 00:22:59.167785 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-lqrnf_43caeb9a-1d22-41be-abb1-48b4881e6afb/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 00:22:59 crc kubenswrapper[4734]: I1206 00:22:59.382589 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-cron-29416321-nx5mh_0e2a8f39-3819-46e4-9f5c-b2378637486f/keystone-cron/0.log" Dec 06 00:22:59 crc kubenswrapper[4734]: I1206 00:22:59.574595 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-fffd48d8f-srcmr_26447265-57c1-45c6-bbef-cf7b2a82ed85/keystone-api/0.log" Dec 06 00:22:59 crc kubenswrapper[4734]: I1206 00:22:59.717558 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_e4c76f3a-43a9-43fc-be28-d7d3081d5e39/kube-state-metrics/0.log" Dec 06 00:22:59 crc kubenswrapper[4734]: I1206 00:22:59.760214 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk_85f32997-f801-4f60-b010-aaff637a8292/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 00:23:00 crc kubenswrapper[4734]: I1206 00:23:00.259249 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-67d74d57d5-4s4p7_f4201381-aab2-40da-9f4a-dc31e8874266/neutron-api/0.log" Dec 06 00:23:00 crc kubenswrapper[4734]: I1206 00:23:00.281699 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-67d74d57d5-4s4p7_f4201381-aab2-40da-9f4a-dc31e8874266/neutron-httpd/0.log" Dec 06 00:23:00 crc kubenswrapper[4734]: I1206 00:23:00.325768 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2_e4c89d06-2d3b-47f8-bc2e-fa34a9d89453/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 00:23:00 crc kubenswrapper[4734]: I1206 00:23:00.915889 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_97634e74-2d01-49ae-b584-650725749027/nova-cell0-conductor-conductor/0.log" Dec 06 00:23:00 crc kubenswrapper[4734]: I1206 00:23:00.973958 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_fe37850d-71e6-4310-9c74-b98b792cecc4/nova-api-log/0.log" Dec 06 00:23:01 crc kubenswrapper[4734]: I1206 00:23:01.195098 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-db-purge-29416320-kjz69_e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5/nova-manage/0.log" Dec 06 00:23:01 crc kubenswrapper[4734]: I1206 00:23:01.291967 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_fe37850d-71e6-4310-9c74-b98b792cecc4/nova-api-api/0.log" Dec 06 00:23:01 crc kubenswrapper[4734]: I1206 00:23:01.355340 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_b801f420-78a0-4564-9339-fca1170a01d7/nova-cell1-conductor-conductor/0.log" Dec 06 00:23:01 crc kubenswrapper[4734]: I1206 00:23:01.513374 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-db-purge-29416320-x2nsf_1d498a8e-4ace-4a26-9c32-2dbc411c0b50/nova-manage/0.log" Dec 06 00:23:01 crc kubenswrapper[4734]: I1206 00:23:01.980733 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_41bee178-e2d7-4047-9c0a-429dc21411ed/nova-cell1-novncproxy-novncproxy/0.log" Dec 06 00:23:02 crc kubenswrapper[4734]: I1206 00:23:02.241506 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-l7kts_7d966291-cd7e-47ce-a95e-bee879371108/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 00:23:02 crc kubenswrapper[4734]: I1206 00:23:02.460664 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_6929b8c5-4cb9-49cd-a084-d578657ce0bf/nova-metadata-log/0.log" Dec 06 00:23:02 crc kubenswrapper[4734]: I1206 00:23:02.759133 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_77364fbf-3dbe-45c3-adf1-94410f61f0ce/nova-scheduler-scheduler/0.log" Dec 06 00:23:02 crc kubenswrapper[4734]: I1206 
00:23:02.842122 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_3cc9e4dc-431f-4963-911b-f6262ac3c6b5/mysql-bootstrap/0.log" Dec 06 00:23:03 crc kubenswrapper[4734]: I1206 00:23:03.032778 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_3cc9e4dc-431f-4963-911b-f6262ac3c6b5/mysql-bootstrap/0.log" Dec 06 00:23:03 crc kubenswrapper[4734]: I1206 00:23:03.123765 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_3cc9e4dc-431f-4963-911b-f6262ac3c6b5/galera/0.log" Dec 06 00:23:03 crc kubenswrapper[4734]: I1206 00:23:03.312766 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_9fd725c7-f12a-4504-a71d-46e7d0258af7/mysql-bootstrap/0.log" Dec 06 00:23:03 crc kubenswrapper[4734]: I1206 00:23:03.471492 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_9fd725c7-f12a-4504-a71d-46e7d0258af7/mysql-bootstrap/0.log" Dec 06 00:23:03 crc kubenswrapper[4734]: I1206 00:23:03.488538 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_9fd725c7-f12a-4504-a71d-46e7d0258af7/galera/0.log" Dec 06 00:23:03 crc kubenswrapper[4734]: I1206 00:23:03.701944 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_1538ece1-e24d-4f20-b92d-0b526d1f5698/openstackclient/0.log" Dec 06 00:23:03 crc kubenswrapper[4734]: I1206 00:23:03.823176 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-587wk_625f2253-5867-4d61-a436-264a79c0bd94/ovn-controller/0.log" Dec 06 00:23:03 crc kubenswrapper[4734]: I1206 00:23:03.909157 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_6929b8c5-4cb9-49cd-a084-d578657ce0bf/nova-metadata-metadata/0.log" Dec 06 00:23:03 crc kubenswrapper[4734]: I1206 00:23:03.992675 4734 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_ovn-controller-metrics-cpzs4_b246fed6-9a79-4d72-a73a-943b13d8e30b/openstack-network-exporter/0.log" Dec 06 00:23:04 crc kubenswrapper[4734]: I1206 00:23:04.190758 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tpdrq_9631bcf5-05df-4e1d-b849-7352ef35013f/ovsdb-server-init/0.log" Dec 06 00:23:04 crc kubenswrapper[4734]: I1206 00:23:04.505655 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tpdrq_9631bcf5-05df-4e1d-b849-7352ef35013f/ovs-vswitchd/0.log" Dec 06 00:23:04 crc kubenswrapper[4734]: I1206 00:23:04.530972 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tpdrq_9631bcf5-05df-4e1d-b849-7352ef35013f/ovsdb-server/0.log" Dec 06 00:23:04 crc kubenswrapper[4734]: I1206 00:23:04.531323 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tpdrq_9631bcf5-05df-4e1d-b849-7352ef35013f/ovsdb-server-init/0.log" Dec 06 00:23:04 crc kubenswrapper[4734]: I1206 00:23:04.814265 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-dxg77_4b772014-ade2-4ef1-9795-8a6eb255f57f/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 00:23:04 crc kubenswrapper[4734]: I1206 00:23:04.819187 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_bf6b4283-12e2-489b-9808-9b4f21a2c080/openstack-network-exporter/0.log" Dec 06 00:23:04 crc kubenswrapper[4734]: I1206 00:23:04.858739 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_bf6b4283-12e2-489b-9808-9b4f21a2c080/ovn-northd/0.log" Dec 06 00:23:05 crc kubenswrapper[4734]: I1206 00:23:05.106349 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3fceddd4-e096-4a7e-875f-756279962334/openstack-network-exporter/0.log" Dec 06 00:23:05 crc kubenswrapper[4734]: I1206 
00:23:05.125986 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3fceddd4-e096-4a7e-875f-756279962334/ovsdbserver-nb/0.log" Dec 06 00:23:05 crc kubenswrapper[4734]: I1206 00:23:05.348496 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350/openstack-network-exporter/0.log" Dec 06 00:23:05 crc kubenswrapper[4734]: I1206 00:23:05.459642 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350/ovsdbserver-sb/0.log" Dec 06 00:23:05 crc kubenswrapper[4734]: I1206 00:23:05.473947 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-85fbcb99c8-4gdvt_ccbbcfb6-1ffd-4c8e-8945-9d496467e46a/placement-api/0.log" Dec 06 00:23:05 crc kubenswrapper[4734]: I1206 00:23:05.706661 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_34a9d7ac-2a42-4352-8eb3-23d34cfc5696/setup-container/0.log" Dec 06 00:23:05 crc kubenswrapper[4734]: I1206 00:23:05.793083 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-85fbcb99c8-4gdvt_ccbbcfb6-1ffd-4c8e-8945-9d496467e46a/placement-log/0.log" Dec 06 00:23:06 crc kubenswrapper[4734]: I1206 00:23:06.110715 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_34a9d7ac-2a42-4352-8eb3-23d34cfc5696/setup-container/0.log" Dec 06 00:23:06 crc kubenswrapper[4734]: I1206 00:23:06.154997 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_556dbce3-075c-473a-ab0d-ea67ffc3e144/setup-container/0.log" Dec 06 00:23:06 crc kubenswrapper[4734]: I1206 00:23:06.190115 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_34a9d7ac-2a42-4352-8eb3-23d34cfc5696/rabbitmq/0.log" Dec 06 00:23:06 crc kubenswrapper[4734]: I1206 00:23:06.465960 4734 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_556dbce3-075c-473a-ab0d-ea67ffc3e144/setup-container/0.log" Dec 06 00:23:06 crc kubenswrapper[4734]: I1206 00:23:06.548635 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_556dbce3-075c-473a-ab0d-ea67ffc3e144/rabbitmq/0.log" Dec 06 00:23:06 crc kubenswrapper[4734]: I1206 00:23:06.582657 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-lz74w_b9d39a80-01a8-421a-afac-94171314c0e1/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 00:23:06 crc kubenswrapper[4734]: I1206 00:23:06.794886 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p_12cd9906-9f9f-42ba-8869-54f39ae29366/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 00:23:06 crc kubenswrapper[4734]: I1206 00:23:06.803722 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-z9lz9_29e8f09f-ca59-420f-ae3c-8bdb696d653a/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 00:23:07 crc kubenswrapper[4734]: I1206 00:23:07.062666 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-pmm9q_378f4ff2-7e86-40ca-b771-155a02f5cb45/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 00:23:07 crc kubenswrapper[4734]: I1206 00:23:07.101388 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-mtqlw_d16abc61-9f6e-4980-9821-af436f2501fe/ssh-known-hosts-edpm-deployment/0.log" Dec 06 00:23:07 crc kubenswrapper[4734]: I1206 00:23:07.363452 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-66674dc5bc-l642k_d955842c-e3a2-4a05-a380-78c6f2fbdf3b/proxy-server/0.log" Dec 06 00:23:07 crc kubenswrapper[4734]: I1206 00:23:07.530186 4734 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-66674dc5bc-l642k_d955842c-e3a2-4a05-a380-78c6f2fbdf3b/proxy-httpd/0.log" Dec 06 00:23:07 crc kubenswrapper[4734]: I1206 00:23:07.689319 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-qdl57_a1e03821-b44b-4ce9-8fb9-6831bf8b087f/swift-ring-rebalance/0.log" Dec 06 00:23:07 crc kubenswrapper[4734]: I1206 00:23:07.749803 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fea25d07-8cbc-4875-89e8-1752b0ee2a9e/account-reaper/0.log" Dec 06 00:23:07 crc kubenswrapper[4734]: I1206 00:23:07.791833 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fea25d07-8cbc-4875-89e8-1752b0ee2a9e/account-auditor/0.log" Dec 06 00:23:07 crc kubenswrapper[4734]: I1206 00:23:07.941342 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fea25d07-8cbc-4875-89e8-1752b0ee2a9e/account-server/0.log" Dec 06 00:23:07 crc kubenswrapper[4734]: I1206 00:23:07.960302 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fea25d07-8cbc-4875-89e8-1752b0ee2a9e/account-replicator/0.log" Dec 06 00:23:07 crc kubenswrapper[4734]: I1206 00:23:07.976251 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fea25d07-8cbc-4875-89e8-1752b0ee2a9e/container-auditor/0.log" Dec 06 00:23:08 crc kubenswrapper[4734]: I1206 00:23:08.111484 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fea25d07-8cbc-4875-89e8-1752b0ee2a9e/container-replicator/0.log" Dec 06 00:23:08 crc kubenswrapper[4734]: I1206 00:23:08.168893 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fea25d07-8cbc-4875-89e8-1752b0ee2a9e/container-server/0.log" Dec 06 00:23:08 crc kubenswrapper[4734]: I1206 00:23:08.190733 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_fea25d07-8cbc-4875-89e8-1752b0ee2a9e/container-updater/0.log" Dec 06 00:23:08 crc kubenswrapper[4734]: I1206 00:23:08.300006 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fea25d07-8cbc-4875-89e8-1752b0ee2a9e/object-auditor/0.log" Dec 06 00:23:08 crc kubenswrapper[4734]: I1206 00:23:08.357358 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fea25d07-8cbc-4875-89e8-1752b0ee2a9e/object-expirer/0.log" Dec 06 00:23:08 crc kubenswrapper[4734]: I1206 00:23:08.395181 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fea25d07-8cbc-4875-89e8-1752b0ee2a9e/object-server/0.log" Dec 06 00:23:08 crc kubenswrapper[4734]: I1206 00:23:08.460725 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fea25d07-8cbc-4875-89e8-1752b0ee2a9e/object-replicator/0.log" Dec 06 00:23:08 crc kubenswrapper[4734]: I1206 00:23:08.532881 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fea25d07-8cbc-4875-89e8-1752b0ee2a9e/object-updater/0.log" Dec 06 00:23:08 crc kubenswrapper[4734]: I1206 00:23:08.608766 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fea25d07-8cbc-4875-89e8-1752b0ee2a9e/swift-recon-cron/0.log" Dec 06 00:23:08 crc kubenswrapper[4734]: I1206 00:23:08.609706 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fea25d07-8cbc-4875-89e8-1752b0ee2a9e/rsync/0.log" Dec 06 00:23:08 crc kubenswrapper[4734]: I1206 00:23:08.834940 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_5d24dfd1-9ec6-4419-84c6-577deb60b95f/tempest-tests-tempest-tests-runner/0.log" Dec 06 00:23:08 crc kubenswrapper[4734]: I1206 00:23:08.845842 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb_039811b0-a938-445d-b5a4-702b526f8356/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 00:23:09 crc kubenswrapper[4734]: I1206 00:23:09.122975 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_e615b54d-cff3-4de2-8569-9c492e2234e0/test-operator-logs-container/0.log" Dec 06 00:23:09 crc kubenswrapper[4734]: I1206 00:23:09.219102 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-45qzx_cc15ec12-e046-4933-beec-886e0868c644/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 00:23:10 crc kubenswrapper[4734]: I1206 00:23:10.615517 4734 scope.go:117] "RemoveContainer" containerID="30e2002c7070045c2b20466e69d767cf741f7938dac6f70dfac6c07537e1dd50" Dec 06 00:23:10 crc kubenswrapper[4734]: E1206 00:23:10.616241 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:23:17 crc kubenswrapper[4734]: I1206 00:23:17.960629 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_61801e1d-6a79-497f-822b-69b683c2f78b/memcached/0.log" Dec 06 00:23:23 crc kubenswrapper[4734]: I1206 00:23:23.614471 4734 scope.go:117] "RemoveContainer" containerID="30e2002c7070045c2b20466e69d767cf741f7938dac6f70dfac6c07537e1dd50" Dec 06 00:23:23 crc kubenswrapper[4734]: E1206 00:23:23.615444 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:23:37 crc kubenswrapper[4734]: I1206 00:23:37.601924 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x_d6a5b5d0-ee84-4715-8024-25698133af6b/util/0.log" Dec 06 00:23:37 crc kubenswrapper[4734]: I1206 00:23:37.748860 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x_d6a5b5d0-ee84-4715-8024-25698133af6b/util/0.log" Dec 06 00:23:37 crc kubenswrapper[4734]: I1206 00:23:37.807737 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x_d6a5b5d0-ee84-4715-8024-25698133af6b/pull/0.log" Dec 06 00:23:37 crc kubenswrapper[4734]: I1206 00:23:37.879014 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x_d6a5b5d0-ee84-4715-8024-25698133af6b/pull/0.log" Dec 06 00:23:38 crc kubenswrapper[4734]: I1206 00:23:38.047372 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x_d6a5b5d0-ee84-4715-8024-25698133af6b/util/0.log" Dec 06 00:23:38 crc kubenswrapper[4734]: I1206 00:23:38.071833 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x_d6a5b5d0-ee84-4715-8024-25698133af6b/pull/0.log" Dec 06 00:23:38 crc kubenswrapper[4734]: I1206 00:23:38.093297 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x_d6a5b5d0-ee84-4715-8024-25698133af6b/extract/0.log" Dec 06 00:23:38 crc kubenswrapper[4734]: I1206 00:23:38.250864 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-bl5vb_4ac00d0e-d1c1-44d8-869d-1d98f5a137e0/kube-rbac-proxy/0.log" Dec 06 00:23:38 crc kubenswrapper[4734]: I1206 00:23:38.347908 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-bl5vb_4ac00d0e-d1c1-44d8-869d-1d98f5a137e0/manager/0.log" Dec 06 00:23:38 crc kubenswrapper[4734]: I1206 00:23:38.615004 4734 scope.go:117] "RemoveContainer" containerID="30e2002c7070045c2b20466e69d767cf741f7938dac6f70dfac6c07537e1dd50" Dec 06 00:23:38 crc kubenswrapper[4734]: E1206 00:23:38.616880 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:23:38 crc kubenswrapper[4734]: I1206 00:23:38.637417 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-4clqb_6ba0bb79-4132-4bd9-a2ce-c8a9b516402d/kube-rbac-proxy/0.log" Dec 06 00:23:38 crc kubenswrapper[4734]: I1206 00:23:38.825832 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-4clqb_6ba0bb79-4132-4bd9-a2ce-c8a9b516402d/manager/0.log" Dec 06 00:23:38 crc kubenswrapper[4734]: I1206 00:23:38.868248 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-fwbcd_157817be-876f-4157-87af-6ef317b91cb9/manager/0.log" Dec 06 00:23:38 crc kubenswrapper[4734]: I1206 00:23:38.880741 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-fwbcd_157817be-876f-4157-87af-6ef317b91cb9/kube-rbac-proxy/0.log" Dec 06 00:23:39 crc kubenswrapper[4734]: I1206 00:23:39.098181 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-rp9j7_c12a23f4-fdd7-455e-b74c-f757f15990ca/kube-rbac-proxy/0.log" Dec 06 00:23:39 crc kubenswrapper[4734]: I1206 00:23:39.157826 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-rp9j7_c12a23f4-fdd7-455e-b74c-f757f15990ca/manager/0.log" Dec 06 00:23:39 crc kubenswrapper[4734]: I1206 00:23:39.285839 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-c7c94_4685a9c2-ef1c-462d-848c-fbbea6a8ebfe/kube-rbac-proxy/0.log" Dec 06 00:23:39 crc kubenswrapper[4734]: I1206 00:23:39.349697 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-c7c94_4685a9c2-ef1c-462d-848c-fbbea6a8ebfe/manager/0.log" Dec 06 00:23:39 crc kubenswrapper[4734]: I1206 00:23:39.411396 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-mnmlv_9a792918-0311-4b1b-8920-a315370ecba7/kube-rbac-proxy/0.log" Dec 06 00:23:39 crc kubenswrapper[4734]: I1206 00:23:39.536362 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-mnmlv_9a792918-0311-4b1b-8920-a315370ecba7/manager/0.log" Dec 06 00:23:39 crc kubenswrapper[4734]: I1206 00:23:39.675629 
4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-r2427_df5aaec7-4487-47a1-98c4-0206d0ecf7f4/kube-rbac-proxy/0.log" Dec 06 00:23:39 crc kubenswrapper[4734]: I1206 00:23:39.913460 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-rw8vg_ef794353-3292-4809-94d8-105aaa36889e/kube-rbac-proxy/0.log" Dec 06 00:23:39 crc kubenswrapper[4734]: I1206 00:23:39.997470 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-r2427_df5aaec7-4487-47a1-98c4-0206d0ecf7f4/manager/0.log" Dec 06 00:23:40 crc kubenswrapper[4734]: I1206 00:23:40.023891 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-rw8vg_ef794353-3292-4809-94d8-105aaa36889e/manager/0.log" Dec 06 00:23:40 crc kubenswrapper[4734]: I1206 00:23:40.184683 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-t5gsd_3255ef71-c5a8-4fef-a1ab-dc2107c710eb/kube-rbac-proxy/0.log" Dec 06 00:23:40 crc kubenswrapper[4734]: I1206 00:23:40.295440 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-t5gsd_3255ef71-c5a8-4fef-a1ab-dc2107c710eb/manager/0.log" Dec 06 00:23:40 crc kubenswrapper[4734]: I1206 00:23:40.324698 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-lwpjm_9883e2bb-76f7-476d-8a74-e358ebf37ed2/kube-rbac-proxy/0.log" Dec 06 00:23:40 crc kubenswrapper[4734]: I1206 00:23:40.468877 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-lwpjm_9883e2bb-76f7-476d-8a74-e358ebf37ed2/manager/0.log" Dec 06 00:23:40 crc 
kubenswrapper[4734]: I1206 00:23:40.576368 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-nc6wd_608bca6a-1cb5-44b9-91c6-32a77372a4e5/kube-rbac-proxy/0.log" Dec 06 00:23:40 crc kubenswrapper[4734]: I1206 00:23:40.594269 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-nc6wd_608bca6a-1cb5-44b9-91c6-32a77372a4e5/manager/0.log" Dec 06 00:23:40 crc kubenswrapper[4734]: I1206 00:23:40.803271 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-zx66z_58aa2c14-9374-45b1-b6dd-07e849f23306/manager/0.log" Dec 06 00:23:40 crc kubenswrapper[4734]: I1206 00:23:40.817095 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-zx66z_58aa2c14-9374-45b1-b6dd-07e849f23306/kube-rbac-proxy/0.log" Dec 06 00:23:40 crc kubenswrapper[4734]: I1206 00:23:40.971012 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-wf4vr_aa5ccaa9-5087-4891-b255-a5135271a2a5/kube-rbac-proxy/0.log" Dec 06 00:23:41 crc kubenswrapper[4734]: I1206 00:23:41.147139 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-bf28l_3ab5c543-f1e6-455c-a051-7940ffcc833d/kube-rbac-proxy/0.log" Dec 06 00:23:41 crc kubenswrapper[4734]: I1206 00:23:41.147327 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-wf4vr_aa5ccaa9-5087-4891-b255-a5135271a2a5/manager/0.log" Dec 06 00:23:41 crc kubenswrapper[4734]: I1206 00:23:41.319383 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-bf28l_3ab5c543-f1e6-455c-a051-7940ffcc833d/manager/0.log" Dec 06 00:23:41 crc kubenswrapper[4734]: I1206 00:23:41.397851 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879fhnqmj_9cce8abe-4425-4cea-ac4f-3fd707bd5737/manager/0.log" Dec 06 00:23:41 crc kubenswrapper[4734]: I1206 00:23:41.417167 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879fhnqmj_9cce8abe-4425-4cea-ac4f-3fd707bd5737/kube-rbac-proxy/0.log" Dec 06 00:23:42 crc kubenswrapper[4734]: I1206 00:23:42.022420 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-c7wgh_99bde61f-d552-4013-b4fc-eb55e428f53b/registry-server/0.log" Dec 06 00:23:42 crc kubenswrapper[4734]: I1206 00:23:42.027856 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-58b957df85-wbwkx_9ef6c4e9-8341-489c-9f21-ffda1c3ef34a/operator/0.log" Dec 06 00:23:42 crc kubenswrapper[4734]: I1206 00:23:42.243621 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-hdt6h_ea29b614-e490-4a3e-925e-d9f6c56b0c35/kube-rbac-proxy/0.log" Dec 06 00:23:42 crc kubenswrapper[4734]: I1206 00:23:42.358834 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-w8l4b_ad6bda6e-964f-44c3-b759-ad151097b4f1/kube-rbac-proxy/0.log" Dec 06 00:23:42 crc kubenswrapper[4734]: I1206 00:23:42.408969 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-hdt6h_ea29b614-e490-4a3e-925e-d9f6c56b0c35/manager/0.log" Dec 06 00:23:42 crc kubenswrapper[4734]: I1206 00:23:42.643507 
4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-w8l4b_ad6bda6e-964f-44c3-b759-ad151097b4f1/manager/0.log" Dec 06 00:23:42 crc kubenswrapper[4734]: I1206 00:23:42.844747 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5845f76896-vhzwq_2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e/manager/0.log" Dec 06 00:23:42 crc kubenswrapper[4734]: I1206 00:23:42.860006 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-4bt9z_43c8ec4c-96f9-47f0-9313-2813ea1c62c2/operator/0.log" Dec 06 00:23:43 crc kubenswrapper[4734]: I1206 00:23:43.028120 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-cdtjx_2050fd66-c55a-4048-a869-cb786b5f0d2b/kube-rbac-proxy/0.log" Dec 06 00:23:43 crc kubenswrapper[4734]: I1206 00:23:43.126978 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-cdtjx_2050fd66-c55a-4048-a869-cb786b5f0d2b/manager/0.log" Dec 06 00:23:43 crc kubenswrapper[4734]: I1206 00:23:43.161821 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-2fm4z_696f07ba-7c46-41f2-826f-890756824285/kube-rbac-proxy/0.log" Dec 06 00:23:43 crc kubenswrapper[4734]: I1206 00:23:43.359335 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-j4m2j_974bff7e-6bfc-49c2-9d3d-831d1bf5385d/kube-rbac-proxy/0.log" Dec 06 00:23:43 crc kubenswrapper[4734]: I1206 00:23:43.365441 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-2fm4z_696f07ba-7c46-41f2-826f-890756824285/manager/0.log" Dec 06 00:23:43 crc 
kubenswrapper[4734]: I1206 00:23:43.439146 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-j4m2j_974bff7e-6bfc-49c2-9d3d-831d1bf5385d/manager/0.log" Dec 06 00:23:43 crc kubenswrapper[4734]: I1206 00:23:43.590633 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-667bd8d554-xwqfj_b7ee6df9-99e2-480d-aa84-7618ff0cda2f/manager/0.log" Dec 06 00:23:43 crc kubenswrapper[4734]: I1206 00:23:43.595095 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-667bd8d554-xwqfj_b7ee6df9-99e2-480d-aa84-7618ff0cda2f/kube-rbac-proxy/0.log" Dec 06 00:23:50 crc kubenswrapper[4734]: I1206 00:23:50.614836 4734 scope.go:117] "RemoveContainer" containerID="30e2002c7070045c2b20466e69d767cf741f7938dac6f70dfac6c07537e1dd50" Dec 06 00:23:50 crc kubenswrapper[4734]: E1206 00:23:50.615898 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:24:05 crc kubenswrapper[4734]: I1206 00:24:05.614368 4734 scope.go:117] "RemoveContainer" containerID="30e2002c7070045c2b20466e69d767cf741f7938dac6f70dfac6c07537e1dd50" Dec 06 00:24:05 crc kubenswrapper[4734]: E1206 00:24:05.616675 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:24:06 crc kubenswrapper[4734]: I1206 00:24:06.845063 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-qvhvd_7b7cfcc6-ee0e-4af5-9e03-5d8cbb40edbb/control-plane-machine-set-operator/0.log" Dec 06 00:24:07 crc kubenswrapper[4734]: I1206 00:24:07.093036 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-j6jsf_74a8397f-0607-4761-9fc5-77e9a6d197c8/kube-rbac-proxy/0.log" Dec 06 00:24:07 crc kubenswrapper[4734]: I1206 00:24:07.116120 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-j6jsf_74a8397f-0607-4761-9fc5-77e9a6d197c8/machine-api-operator/0.log" Dec 06 00:24:16 crc kubenswrapper[4734]: I1206 00:24:16.614801 4734 scope.go:117] "RemoveContainer" containerID="30e2002c7070045c2b20466e69d767cf741f7938dac6f70dfac6c07537e1dd50" Dec 06 00:24:16 crc kubenswrapper[4734]: E1206 00:24:16.616013 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:24:21 crc kubenswrapper[4734]: I1206 00:24:21.388057 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-hnchl_77b0debe-a9d9-495d-baf3-e5ad3c05541a/cert-manager-controller/0.log" Dec 06 00:24:21 crc kubenswrapper[4734]: I1206 00:24:21.604995 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-pffbx_4aa5e323-62f7-491b-a47e-747b2d32cfc5/cert-manager-webhook/0.log" Dec 06 00:24:21 crc kubenswrapper[4734]: I1206 00:24:21.664201 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-j92m9_4806cf35-7fd8-4044-8618-8e573c476375/cert-manager-cainjector/0.log" Dec 06 00:24:28 crc kubenswrapper[4734]: I1206 00:24:28.614386 4734 scope.go:117] "RemoveContainer" containerID="30e2002c7070045c2b20466e69d767cf741f7938dac6f70dfac6c07537e1dd50" Dec 06 00:24:28 crc kubenswrapper[4734]: E1206 00:24:28.615603 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:24:34 crc kubenswrapper[4734]: I1206 00:24:34.302297 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-cr2t4_238e4a30-5ad1-4948-b27f-41e096f3095a/nmstate-console-plugin/0.log" Dec 06 00:24:34 crc kubenswrapper[4734]: I1206 00:24:34.444374 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-x728q_ab1d36f1-0fc8-4ad6-8725-799c1838b033/nmstate-handler/0.log" Dec 06 00:24:34 crc kubenswrapper[4734]: I1206 00:24:34.566060 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-6p9xx_408e85a8-5bd9-4c30-bd55-5262e3a2aa24/nmstate-metrics/0.log" Dec 06 00:24:34 crc kubenswrapper[4734]: I1206 00:24:34.596081 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-6p9xx_408e85a8-5bd9-4c30-bd55-5262e3a2aa24/kube-rbac-proxy/0.log" Dec 06 00:24:34 crc kubenswrapper[4734]: I1206 00:24:34.775718 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-gxn6v_7b52ad0f-5f7e-4691-be39-ac2f121bb909/nmstate-operator/0.log" Dec 06 00:24:34 crc kubenswrapper[4734]: I1206 00:24:34.830152 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-zxqpc_6bf99a15-c582-4a10-a26f-252c1c870f55/nmstate-webhook/0.log" Dec 06 00:24:40 crc kubenswrapper[4734]: I1206 00:24:40.614397 4734 scope.go:117] "RemoveContainer" containerID="30e2002c7070045c2b20466e69d767cf741f7938dac6f70dfac6c07537e1dd50" Dec 06 00:24:40 crc kubenswrapper[4734]: E1206 00:24:40.615489 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:24:51 crc kubenswrapper[4734]: I1206 00:24:51.585840 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vc887"] Dec 06 00:24:51 crc kubenswrapper[4734]: E1206 00:24:51.587256 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="558f03d8-9b38-4f41-8b98-1efcaf4242b7" containerName="container-00" Dec 06 00:24:51 crc kubenswrapper[4734]: I1206 00:24:51.587275 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="558f03d8-9b38-4f41-8b98-1efcaf4242b7" containerName="container-00" Dec 06 00:24:51 crc kubenswrapper[4734]: I1206 00:24:51.587555 4734 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="558f03d8-9b38-4f41-8b98-1efcaf4242b7" containerName="container-00" Dec 06 00:24:51 crc kubenswrapper[4734]: I1206 00:24:51.589674 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vc887" Dec 06 00:24:51 crc kubenswrapper[4734]: I1206 00:24:51.598871 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vc887"] Dec 06 00:24:51 crc kubenswrapper[4734]: I1206 00:24:51.768964 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da3c6bc6-7183-4b38-8c28-79faceca44b3-catalog-content\") pod \"certified-operators-vc887\" (UID: \"da3c6bc6-7183-4b38-8c28-79faceca44b3\") " pod="openshift-marketplace/certified-operators-vc887" Dec 06 00:24:51 crc kubenswrapper[4734]: I1206 00:24:51.769068 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da3c6bc6-7183-4b38-8c28-79faceca44b3-utilities\") pod \"certified-operators-vc887\" (UID: \"da3c6bc6-7183-4b38-8c28-79faceca44b3\") " pod="openshift-marketplace/certified-operators-vc887" Dec 06 00:24:51 crc kubenswrapper[4734]: I1206 00:24:51.769137 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtvqq\" (UniqueName: \"kubernetes.io/projected/da3c6bc6-7183-4b38-8c28-79faceca44b3-kube-api-access-mtvqq\") pod \"certified-operators-vc887\" (UID: \"da3c6bc6-7183-4b38-8c28-79faceca44b3\") " pod="openshift-marketplace/certified-operators-vc887" Dec 06 00:24:51 crc kubenswrapper[4734]: I1206 00:24:51.871987 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtvqq\" (UniqueName: \"kubernetes.io/projected/da3c6bc6-7183-4b38-8c28-79faceca44b3-kube-api-access-mtvqq\") pod \"certified-operators-vc887\" (UID: 
\"da3c6bc6-7183-4b38-8c28-79faceca44b3\") " pod="openshift-marketplace/certified-operators-vc887" Dec 06 00:24:51 crc kubenswrapper[4734]: I1206 00:24:51.872284 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da3c6bc6-7183-4b38-8c28-79faceca44b3-catalog-content\") pod \"certified-operators-vc887\" (UID: \"da3c6bc6-7183-4b38-8c28-79faceca44b3\") " pod="openshift-marketplace/certified-operators-vc887" Dec 06 00:24:51 crc kubenswrapper[4734]: I1206 00:24:51.872325 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da3c6bc6-7183-4b38-8c28-79faceca44b3-utilities\") pod \"certified-operators-vc887\" (UID: \"da3c6bc6-7183-4b38-8c28-79faceca44b3\") " pod="openshift-marketplace/certified-operators-vc887" Dec 06 00:24:51 crc kubenswrapper[4734]: I1206 00:24:51.872899 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da3c6bc6-7183-4b38-8c28-79faceca44b3-catalog-content\") pod \"certified-operators-vc887\" (UID: \"da3c6bc6-7183-4b38-8c28-79faceca44b3\") " pod="openshift-marketplace/certified-operators-vc887" Dec 06 00:24:51 crc kubenswrapper[4734]: I1206 00:24:51.872928 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da3c6bc6-7183-4b38-8c28-79faceca44b3-utilities\") pod \"certified-operators-vc887\" (UID: \"da3c6bc6-7183-4b38-8c28-79faceca44b3\") " pod="openshift-marketplace/certified-operators-vc887" Dec 06 00:24:51 crc kubenswrapper[4734]: I1206 00:24:51.897606 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtvqq\" (UniqueName: \"kubernetes.io/projected/da3c6bc6-7183-4b38-8c28-79faceca44b3-kube-api-access-mtvqq\") pod \"certified-operators-vc887\" (UID: \"da3c6bc6-7183-4b38-8c28-79faceca44b3\") " 
pod="openshift-marketplace/certified-operators-vc887" Dec 06 00:24:51 crc kubenswrapper[4734]: I1206 00:24:51.929285 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vc887" Dec 06 00:24:52 crc kubenswrapper[4734]: I1206 00:24:52.751176 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vc887"] Dec 06 00:24:53 crc kubenswrapper[4734]: I1206 00:24:53.106238 4734 generic.go:334] "Generic (PLEG): container finished" podID="da3c6bc6-7183-4b38-8c28-79faceca44b3" containerID="56c4c575430bacb96645066d734dc30caeddcc664fcccbbbf9a6a7a9abadc184" exitCode=0 Dec 06 00:24:53 crc kubenswrapper[4734]: I1206 00:24:53.106344 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vc887" event={"ID":"da3c6bc6-7183-4b38-8c28-79faceca44b3","Type":"ContainerDied","Data":"56c4c575430bacb96645066d734dc30caeddcc664fcccbbbf9a6a7a9abadc184"} Dec 06 00:24:53 crc kubenswrapper[4734]: I1206 00:24:53.106650 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vc887" event={"ID":"da3c6bc6-7183-4b38-8c28-79faceca44b3","Type":"ContainerStarted","Data":"fb9b224b8b24141ae9f802c4e5bf67db0ab9485f7c6b8daca0d039d781334097"} Dec 06 00:24:54 crc kubenswrapper[4734]: I1206 00:24:54.614367 4734 scope.go:117] "RemoveContainer" containerID="30e2002c7070045c2b20466e69d767cf741f7938dac6f70dfac6c07537e1dd50" Dec 06 00:24:54 crc kubenswrapper[4734]: E1206 00:24:54.615401 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 
00:24:55 crc kubenswrapper[4734]: I1206 00:24:55.146107 4734 generic.go:334] "Generic (PLEG): container finished" podID="da3c6bc6-7183-4b38-8c28-79faceca44b3" containerID="5a10f4c3d08591777daa906fa7dd8294a34614a8152269dbfff00f8a5abc297f" exitCode=0 Dec 06 00:24:55 crc kubenswrapper[4734]: I1206 00:24:55.146181 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vc887" event={"ID":"da3c6bc6-7183-4b38-8c28-79faceca44b3","Type":"ContainerDied","Data":"5a10f4c3d08591777daa906fa7dd8294a34614a8152269dbfff00f8a5abc297f"} Dec 06 00:24:56 crc kubenswrapper[4734]: I1206 00:24:56.160636 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vc887" event={"ID":"da3c6bc6-7183-4b38-8c28-79faceca44b3","Type":"ContainerStarted","Data":"b1f4994e588e1ac1eaebe19a9683388308d85b5daba23f4e8bb563bddfce0385"} Dec 06 00:24:56 crc kubenswrapper[4734]: I1206 00:24:56.196295 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vc887" podStartSLOduration=2.697215475 podStartE2EDuration="5.196267033s" podCreationTimestamp="2025-12-06 00:24:51 +0000 UTC" firstStartedPulling="2025-12-06 00:24:53.10790059 +0000 UTC m=+3913.791304866" lastFinishedPulling="2025-12-06 00:24:55.606952148 +0000 UTC m=+3916.290356424" observedRunningTime="2025-12-06 00:24:56.182800642 +0000 UTC m=+3916.866204918" watchObservedRunningTime="2025-12-06 00:24:56.196267033 +0000 UTC m=+3916.879671309" Dec 06 00:24:57 crc kubenswrapper[4734]: I1206 00:24:57.321732 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-cpg9k_ac2c5d10-25e3-4d0e-9632-ee5701c15e7e/kube-rbac-proxy/0.log" Dec 06 00:24:57 crc kubenswrapper[4734]: I1206 00:24:57.376719 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-cpg9k_ac2c5d10-25e3-4d0e-9632-ee5701c15e7e/controller/0.log" Dec 06 00:24:57 
crc kubenswrapper[4734]: I1206 00:24:57.549609 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/cp-frr-files/0.log" Dec 06 00:24:57 crc kubenswrapper[4734]: I1206 00:24:57.830451 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/cp-frr-files/0.log" Dec 06 00:24:57 crc kubenswrapper[4734]: I1206 00:24:57.867104 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/cp-reloader/0.log" Dec 06 00:24:57 crc kubenswrapper[4734]: I1206 00:24:57.879453 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/cp-metrics/0.log" Dec 06 00:24:57 crc kubenswrapper[4734]: I1206 00:24:57.912176 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/cp-reloader/0.log" Dec 06 00:24:58 crc kubenswrapper[4734]: I1206 00:24:58.140543 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/cp-frr-files/0.log" Dec 06 00:24:58 crc kubenswrapper[4734]: I1206 00:24:58.174391 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/cp-reloader/0.log" Dec 06 00:24:58 crc kubenswrapper[4734]: I1206 00:24:58.177077 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/cp-metrics/0.log" Dec 06 00:24:58 crc kubenswrapper[4734]: I1206 00:24:58.200864 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/cp-metrics/0.log" Dec 06 00:24:58 crc kubenswrapper[4734]: I1206 00:24:58.383485 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/cp-metrics/0.log" Dec 06 00:24:58 crc kubenswrapper[4734]: I1206 00:24:58.385501 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/cp-frr-files/0.log" Dec 06 00:24:58 crc kubenswrapper[4734]: I1206 00:24:58.425016 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/controller/0.log" Dec 06 00:24:58 crc kubenswrapper[4734]: I1206 00:24:58.434484 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/cp-reloader/0.log" Dec 06 00:24:58 crc kubenswrapper[4734]: I1206 00:24:58.634258 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/frr-metrics/0.log" Dec 06 00:24:58 crc kubenswrapper[4734]: I1206 00:24:58.667896 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/kube-rbac-proxy/0.log" Dec 06 00:24:58 crc kubenswrapper[4734]: I1206 00:24:58.703859 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/kube-rbac-proxy-frr/0.log" Dec 06 00:24:58 crc kubenswrapper[4734]: I1206 00:24:58.845654 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/reloader/0.log" Dec 06 00:24:59 crc kubenswrapper[4734]: I1206 00:24:59.003386 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-g87xk_80152489-1b48-4b06-8684-983081b45f88/frr-k8s-webhook-server/0.log" Dec 06 00:24:59 crc kubenswrapper[4734]: I1206 00:24:59.336462 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5b69785d4f-gksx6_df912953-69c4-4841-abb5-afa544bd8df7/manager/0.log" Dec 06 00:24:59 crc kubenswrapper[4734]: I1206 00:24:59.350079 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-8688474b6d-2dhr7_97e5de92-85a3-4262-a82f-5b7195d72a9c/webhook-server/0.log" Dec 06 00:25:00 crc kubenswrapper[4734]: I1206 00:25:00.004829 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-csdv2_0b07126f-ef86-48d5-b597-56782b518f5e/kube-rbac-proxy/0.log" Dec 06 00:25:00 crc kubenswrapper[4734]: I1206 00:25:00.136327 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/frr/0.log" Dec 06 00:25:00 crc kubenswrapper[4734]: I1206 00:25:00.348314 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-csdv2_0b07126f-ef86-48d5-b597-56782b518f5e/speaker/0.log" Dec 06 00:25:01 crc kubenswrapper[4734]: I1206 00:25:01.930637 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vc887" Dec 06 00:25:01 crc kubenswrapper[4734]: I1206 00:25:01.931431 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vc887" Dec 06 00:25:01 crc kubenswrapper[4734]: I1206 00:25:01.987554 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vc887" Dec 06 00:25:02 crc kubenswrapper[4734]: I1206 00:25:02.284116 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vc887" Dec 06 00:25:02 crc kubenswrapper[4734]: I1206 00:25:02.347911 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vc887"] Dec 06 00:25:04 crc kubenswrapper[4734]: I1206 
00:25:04.243817 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vc887" podUID="da3c6bc6-7183-4b38-8c28-79faceca44b3" containerName="registry-server" containerID="cri-o://b1f4994e588e1ac1eaebe19a9683388308d85b5daba23f4e8bb563bddfce0385" gracePeriod=2 Dec 06 00:25:04 crc kubenswrapper[4734]: I1206 00:25:04.751899 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vc887" Dec 06 00:25:04 crc kubenswrapper[4734]: I1206 00:25:04.873472 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da3c6bc6-7183-4b38-8c28-79faceca44b3-catalog-content\") pod \"da3c6bc6-7183-4b38-8c28-79faceca44b3\" (UID: \"da3c6bc6-7183-4b38-8c28-79faceca44b3\") " Dec 06 00:25:04 crc kubenswrapper[4734]: I1206 00:25:04.873586 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da3c6bc6-7183-4b38-8c28-79faceca44b3-utilities\") pod \"da3c6bc6-7183-4b38-8c28-79faceca44b3\" (UID: \"da3c6bc6-7183-4b38-8c28-79faceca44b3\") " Dec 06 00:25:04 crc kubenswrapper[4734]: I1206 00:25:04.873887 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtvqq\" (UniqueName: \"kubernetes.io/projected/da3c6bc6-7183-4b38-8c28-79faceca44b3-kube-api-access-mtvqq\") pod \"da3c6bc6-7183-4b38-8c28-79faceca44b3\" (UID: \"da3c6bc6-7183-4b38-8c28-79faceca44b3\") " Dec 06 00:25:04 crc kubenswrapper[4734]: I1206 00:25:04.876278 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da3c6bc6-7183-4b38-8c28-79faceca44b3-utilities" (OuterVolumeSpecName: "utilities") pod "da3c6bc6-7183-4b38-8c28-79faceca44b3" (UID: "da3c6bc6-7183-4b38-8c28-79faceca44b3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 00:25:04 crc kubenswrapper[4734]: I1206 00:25:04.882072 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da3c6bc6-7183-4b38-8c28-79faceca44b3-kube-api-access-mtvqq" (OuterVolumeSpecName: "kube-api-access-mtvqq") pod "da3c6bc6-7183-4b38-8c28-79faceca44b3" (UID: "da3c6bc6-7183-4b38-8c28-79faceca44b3"). InnerVolumeSpecName "kube-api-access-mtvqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:25:04 crc kubenswrapper[4734]: I1206 00:25:04.927497 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da3c6bc6-7183-4b38-8c28-79faceca44b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da3c6bc6-7183-4b38-8c28-79faceca44b3" (UID: "da3c6bc6-7183-4b38-8c28-79faceca44b3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 00:25:04 crc kubenswrapper[4734]: I1206 00:25:04.977219 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtvqq\" (UniqueName: \"kubernetes.io/projected/da3c6bc6-7183-4b38-8c28-79faceca44b3-kube-api-access-mtvqq\") on node \"crc\" DevicePath \"\"" Dec 06 00:25:04 crc kubenswrapper[4734]: I1206 00:25:04.977260 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da3c6bc6-7183-4b38-8c28-79faceca44b3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 00:25:04 crc kubenswrapper[4734]: I1206 00:25:04.977270 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da3c6bc6-7183-4b38-8c28-79faceca44b3-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 00:25:05 crc kubenswrapper[4734]: I1206 00:25:05.256106 4734 generic.go:334] "Generic (PLEG): container finished" podID="da3c6bc6-7183-4b38-8c28-79faceca44b3" 
containerID="b1f4994e588e1ac1eaebe19a9683388308d85b5daba23f4e8bb563bddfce0385" exitCode=0 Dec 06 00:25:05 crc kubenswrapper[4734]: I1206 00:25:05.256159 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vc887" event={"ID":"da3c6bc6-7183-4b38-8c28-79faceca44b3","Type":"ContainerDied","Data":"b1f4994e588e1ac1eaebe19a9683388308d85b5daba23f4e8bb563bddfce0385"} Dec 06 00:25:05 crc kubenswrapper[4734]: I1206 00:25:05.256197 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vc887" event={"ID":"da3c6bc6-7183-4b38-8c28-79faceca44b3","Type":"ContainerDied","Data":"fb9b224b8b24141ae9f802c4e5bf67db0ab9485f7c6b8daca0d039d781334097"} Dec 06 00:25:05 crc kubenswrapper[4734]: I1206 00:25:05.256218 4734 scope.go:117] "RemoveContainer" containerID="b1f4994e588e1ac1eaebe19a9683388308d85b5daba23f4e8bb563bddfce0385" Dec 06 00:25:05 crc kubenswrapper[4734]: I1206 00:25:05.256380 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vc887" Dec 06 00:25:05 crc kubenswrapper[4734]: I1206 00:25:05.300515 4734 scope.go:117] "RemoveContainer" containerID="5a10f4c3d08591777daa906fa7dd8294a34614a8152269dbfff00f8a5abc297f" Dec 06 00:25:05 crc kubenswrapper[4734]: I1206 00:25:05.303078 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vc887"] Dec 06 00:25:05 crc kubenswrapper[4734]: I1206 00:25:05.329398 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vc887"] Dec 06 00:25:05 crc kubenswrapper[4734]: I1206 00:25:05.341686 4734 scope.go:117] "RemoveContainer" containerID="56c4c575430bacb96645066d734dc30caeddcc664fcccbbbf9a6a7a9abadc184" Dec 06 00:25:05 crc kubenswrapper[4734]: I1206 00:25:05.383694 4734 scope.go:117] "RemoveContainer" containerID="b1f4994e588e1ac1eaebe19a9683388308d85b5daba23f4e8bb563bddfce0385" Dec 06 00:25:05 crc kubenswrapper[4734]: E1206 00:25:05.384346 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1f4994e588e1ac1eaebe19a9683388308d85b5daba23f4e8bb563bddfce0385\": container with ID starting with b1f4994e588e1ac1eaebe19a9683388308d85b5daba23f4e8bb563bddfce0385 not found: ID does not exist" containerID="b1f4994e588e1ac1eaebe19a9683388308d85b5daba23f4e8bb563bddfce0385" Dec 06 00:25:05 crc kubenswrapper[4734]: I1206 00:25:05.384412 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1f4994e588e1ac1eaebe19a9683388308d85b5daba23f4e8bb563bddfce0385"} err="failed to get container status \"b1f4994e588e1ac1eaebe19a9683388308d85b5daba23f4e8bb563bddfce0385\": rpc error: code = NotFound desc = could not find container \"b1f4994e588e1ac1eaebe19a9683388308d85b5daba23f4e8bb563bddfce0385\": container with ID starting with b1f4994e588e1ac1eaebe19a9683388308d85b5daba23f4e8bb563bddfce0385 not 
found: ID does not exist" Dec 06 00:25:05 crc kubenswrapper[4734]: I1206 00:25:05.384459 4734 scope.go:117] "RemoveContainer" containerID="5a10f4c3d08591777daa906fa7dd8294a34614a8152269dbfff00f8a5abc297f" Dec 06 00:25:05 crc kubenswrapper[4734]: E1206 00:25:05.385085 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a10f4c3d08591777daa906fa7dd8294a34614a8152269dbfff00f8a5abc297f\": container with ID starting with 5a10f4c3d08591777daa906fa7dd8294a34614a8152269dbfff00f8a5abc297f not found: ID does not exist" containerID="5a10f4c3d08591777daa906fa7dd8294a34614a8152269dbfff00f8a5abc297f" Dec 06 00:25:05 crc kubenswrapper[4734]: I1206 00:25:05.385128 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a10f4c3d08591777daa906fa7dd8294a34614a8152269dbfff00f8a5abc297f"} err="failed to get container status \"5a10f4c3d08591777daa906fa7dd8294a34614a8152269dbfff00f8a5abc297f\": rpc error: code = NotFound desc = could not find container \"5a10f4c3d08591777daa906fa7dd8294a34614a8152269dbfff00f8a5abc297f\": container with ID starting with 5a10f4c3d08591777daa906fa7dd8294a34614a8152269dbfff00f8a5abc297f not found: ID does not exist" Dec 06 00:25:05 crc kubenswrapper[4734]: I1206 00:25:05.385160 4734 scope.go:117] "RemoveContainer" containerID="56c4c575430bacb96645066d734dc30caeddcc664fcccbbbf9a6a7a9abadc184" Dec 06 00:25:05 crc kubenswrapper[4734]: E1206 00:25:05.385556 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56c4c575430bacb96645066d734dc30caeddcc664fcccbbbf9a6a7a9abadc184\": container with ID starting with 56c4c575430bacb96645066d734dc30caeddcc664fcccbbbf9a6a7a9abadc184 not found: ID does not exist" containerID="56c4c575430bacb96645066d734dc30caeddcc664fcccbbbf9a6a7a9abadc184" Dec 06 00:25:05 crc kubenswrapper[4734]: I1206 00:25:05.385692 4734 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56c4c575430bacb96645066d734dc30caeddcc664fcccbbbf9a6a7a9abadc184"} err="failed to get container status \"56c4c575430bacb96645066d734dc30caeddcc664fcccbbbf9a6a7a9abadc184\": rpc error: code = NotFound desc = could not find container \"56c4c575430bacb96645066d734dc30caeddcc664fcccbbbf9a6a7a9abadc184\": container with ID starting with 56c4c575430bacb96645066d734dc30caeddcc664fcccbbbf9a6a7a9abadc184 not found: ID does not exist" Dec 06 00:25:05 crc kubenswrapper[4734]: I1206 00:25:05.618439 4734 scope.go:117] "RemoveContainer" containerID="30e2002c7070045c2b20466e69d767cf741f7938dac6f70dfac6c07537e1dd50" Dec 06 00:25:05 crc kubenswrapper[4734]: E1206 00:25:05.619191 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:25:05 crc kubenswrapper[4734]: I1206 00:25:05.634167 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da3c6bc6-7183-4b38-8c28-79faceca44b3" path="/var/lib/kubelet/pods/da3c6bc6-7183-4b38-8c28-79faceca44b3/volumes" Dec 06 00:25:13 crc kubenswrapper[4734]: I1206 00:25:13.809139 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5_44479d90-68b0-4428-b667-5c5c8bbebf2e/util/0.log" Dec 06 00:25:14 crc kubenswrapper[4734]: I1206 00:25:14.009463 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5_44479d90-68b0-4428-b667-5c5c8bbebf2e/util/0.log" Dec 06 00:25:14 crc 
kubenswrapper[4734]: I1206 00:25:14.039162 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5_44479d90-68b0-4428-b667-5c5c8bbebf2e/pull/0.log" Dec 06 00:25:14 crc kubenswrapper[4734]: I1206 00:25:14.050010 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5_44479d90-68b0-4428-b667-5c5c8bbebf2e/pull/0.log" Dec 06 00:25:14 crc kubenswrapper[4734]: I1206 00:25:14.276832 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5_44479d90-68b0-4428-b667-5c5c8bbebf2e/util/0.log" Dec 06 00:25:14 crc kubenswrapper[4734]: I1206 00:25:14.289032 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5_44479d90-68b0-4428-b667-5c5c8bbebf2e/pull/0.log" Dec 06 00:25:14 crc kubenswrapper[4734]: I1206 00:25:14.371629 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5_44479d90-68b0-4428-b667-5c5c8bbebf2e/extract/0.log" Dec 06 00:25:14 crc kubenswrapper[4734]: I1206 00:25:14.471001 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6_13194382-29bc-40a1-8f25-9566b13ad6ae/util/0.log" Dec 06 00:25:14 crc kubenswrapper[4734]: I1206 00:25:14.678723 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6_13194382-29bc-40a1-8f25-9566b13ad6ae/util/0.log" Dec 06 00:25:14 crc kubenswrapper[4734]: I1206 00:25:14.683167 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6_13194382-29bc-40a1-8f25-9566b13ad6ae/pull/0.log" Dec 06 00:25:14 crc kubenswrapper[4734]: I1206 00:25:14.691321 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6_13194382-29bc-40a1-8f25-9566b13ad6ae/pull/0.log" Dec 06 00:25:14 crc kubenswrapper[4734]: I1206 00:25:14.865211 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6_13194382-29bc-40a1-8f25-9566b13ad6ae/pull/0.log" Dec 06 00:25:14 crc kubenswrapper[4734]: I1206 00:25:14.867745 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6_13194382-29bc-40a1-8f25-9566b13ad6ae/util/0.log" Dec 06 00:25:14 crc kubenswrapper[4734]: I1206 00:25:14.905153 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6_13194382-29bc-40a1-8f25-9566b13ad6ae/extract/0.log" Dec 06 00:25:15 crc kubenswrapper[4734]: I1206 00:25:15.107807 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zvcrv_03642c63-70ad-48c4-9fa1-c0e8d2d0d067/extract-utilities/0.log" Dec 06 00:25:15 crc kubenswrapper[4734]: I1206 00:25:15.281017 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zvcrv_03642c63-70ad-48c4-9fa1-c0e8d2d0d067/extract-content/0.log" Dec 06 00:25:15 crc kubenswrapper[4734]: I1206 00:25:15.283166 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zvcrv_03642c63-70ad-48c4-9fa1-c0e8d2d0d067/extract-utilities/0.log" Dec 06 00:25:15 crc kubenswrapper[4734]: I1206 00:25:15.312824 4734 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zvcrv_03642c63-70ad-48c4-9fa1-c0e8d2d0d067/extract-content/0.log" Dec 06 00:25:15 crc kubenswrapper[4734]: I1206 00:25:15.470890 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zvcrv_03642c63-70ad-48c4-9fa1-c0e8d2d0d067/extract-utilities/0.log" Dec 06 00:25:15 crc kubenswrapper[4734]: I1206 00:25:15.509268 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zvcrv_03642c63-70ad-48c4-9fa1-c0e8d2d0d067/extract-content/0.log" Dec 06 00:25:15 crc kubenswrapper[4734]: I1206 00:25:15.755366 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rxk7p_1d3776cd-4682-4c8b-94e2-73bc8c1ee60e/extract-utilities/0.log" Dec 06 00:25:15 crc kubenswrapper[4734]: I1206 00:25:15.980075 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rxk7p_1d3776cd-4682-4c8b-94e2-73bc8c1ee60e/extract-content/0.log" Dec 06 00:25:16 crc kubenswrapper[4734]: I1206 00:25:16.035647 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rxk7p_1d3776cd-4682-4c8b-94e2-73bc8c1ee60e/extract-utilities/0.log" Dec 06 00:25:16 crc kubenswrapper[4734]: I1206 00:25:16.050721 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rxk7p_1d3776cd-4682-4c8b-94e2-73bc8c1ee60e/extract-content/0.log" Dec 06 00:25:16 crc kubenswrapper[4734]: I1206 00:25:16.121232 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zvcrv_03642c63-70ad-48c4-9fa1-c0e8d2d0d067/registry-server/0.log" Dec 06 00:25:16 crc kubenswrapper[4734]: I1206 00:25:16.224334 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-rxk7p_1d3776cd-4682-4c8b-94e2-73bc8c1ee60e/extract-content/0.log" Dec 06 00:25:16 crc kubenswrapper[4734]: I1206 00:25:16.258490 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rxk7p_1d3776cd-4682-4c8b-94e2-73bc8c1ee60e/extract-utilities/0.log" Dec 06 00:25:16 crc kubenswrapper[4734]: I1206 00:25:16.547269 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-zddtm_8cccd0a8-35c0-4e22-b73c-bc9282c804b6/marketplace-operator/0.log" Dec 06 00:25:16 crc kubenswrapper[4734]: I1206 00:25:16.695983 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fxgbf_52d0416a-4b26-4c76-8296-f279ad8c4158/extract-utilities/0.log" Dec 06 00:25:16 crc kubenswrapper[4734]: I1206 00:25:16.786643 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rxk7p_1d3776cd-4682-4c8b-94e2-73bc8c1ee60e/registry-server/0.log" Dec 06 00:25:16 crc kubenswrapper[4734]: I1206 00:25:16.977313 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fxgbf_52d0416a-4b26-4c76-8296-f279ad8c4158/extract-content/0.log" Dec 06 00:25:16 crc kubenswrapper[4734]: I1206 00:25:16.985763 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fxgbf_52d0416a-4b26-4c76-8296-f279ad8c4158/extract-utilities/0.log" Dec 06 00:25:16 crc kubenswrapper[4734]: I1206 00:25:16.985763 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fxgbf_52d0416a-4b26-4c76-8296-f279ad8c4158/extract-content/0.log" Dec 06 00:25:17 crc kubenswrapper[4734]: I1206 00:25:17.246446 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-fxgbf_52d0416a-4b26-4c76-8296-f279ad8c4158/extract-content/0.log" Dec 06 00:25:17 crc kubenswrapper[4734]: I1206 00:25:17.247144 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fxgbf_52d0416a-4b26-4c76-8296-f279ad8c4158/extract-utilities/0.log" Dec 06 00:25:17 crc kubenswrapper[4734]: I1206 00:25:17.358701 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fxgbf_52d0416a-4b26-4c76-8296-f279ad8c4158/registry-server/0.log" Dec 06 00:25:17 crc kubenswrapper[4734]: I1206 00:25:17.489078 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bspnj_9e82acab-ae84-48a7-83bd-7f83a96e3f7f/extract-utilities/0.log" Dec 06 00:25:17 crc kubenswrapper[4734]: I1206 00:25:17.614937 4734 scope.go:117] "RemoveContainer" containerID="30e2002c7070045c2b20466e69d767cf741f7938dac6f70dfac6c07537e1dd50" Dec 06 00:25:17 crc kubenswrapper[4734]: E1206 00:25:17.615491 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:25:17 crc kubenswrapper[4734]: I1206 00:25:17.661213 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bspnj_9e82acab-ae84-48a7-83bd-7f83a96e3f7f/extract-content/0.log" Dec 06 00:25:17 crc kubenswrapper[4734]: I1206 00:25:17.717731 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bspnj_9e82acab-ae84-48a7-83bd-7f83a96e3f7f/extract-content/0.log" Dec 06 00:25:17 crc 
kubenswrapper[4734]: I1206 00:25:17.718486 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bspnj_9e82acab-ae84-48a7-83bd-7f83a96e3f7f/extract-utilities/0.log" Dec 06 00:25:17 crc kubenswrapper[4734]: I1206 00:25:17.889883 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bspnj_9e82acab-ae84-48a7-83bd-7f83a96e3f7f/extract-utilities/0.log" Dec 06 00:25:17 crc kubenswrapper[4734]: I1206 00:25:17.890496 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bspnj_9e82acab-ae84-48a7-83bd-7f83a96e3f7f/extract-content/0.log" Dec 06 00:25:18 crc kubenswrapper[4734]: I1206 00:25:18.542613 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bspnj_9e82acab-ae84-48a7-83bd-7f83a96e3f7f/registry-server/0.log" Dec 06 00:25:32 crc kubenswrapper[4734]: I1206 00:25:32.615142 4734 scope.go:117] "RemoveContainer" containerID="30e2002c7070045c2b20466e69d767cf741f7938dac6f70dfac6c07537e1dd50" Dec 06 00:25:32 crc kubenswrapper[4734]: E1206 00:25:32.616326 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:25:45 crc kubenswrapper[4734]: I1206 00:25:45.614826 4734 scope.go:117] "RemoveContainer" containerID="30e2002c7070045c2b20466e69d767cf741f7938dac6f70dfac6c07537e1dd50" Dec 06 00:25:45 crc kubenswrapper[4734]: E1206 00:25:45.616153 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:25:51 crc kubenswrapper[4734]: E1206 00:25:51.061800 4734 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.38:37954->38.102.83.38:44725: write tcp 38.102.83.38:37954->38.102.83.38:44725: write: broken pipe Dec 06 00:25:57 crc kubenswrapper[4734]: I1206 00:25:57.614278 4734 scope.go:117] "RemoveContainer" containerID="30e2002c7070045c2b20466e69d767cf741f7938dac6f70dfac6c07537e1dd50" Dec 06 00:25:57 crc kubenswrapper[4734]: I1206 00:25:57.937712 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerStarted","Data":"81278da5539b6ec2789a9334b6c891b6b36cb63a7e5b4a031f5c2f40b60a134e"} Dec 06 00:27:13 crc kubenswrapper[4734]: I1206 00:27:13.067707 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kfj5q"] Dec 06 00:27:13 crc kubenswrapper[4734]: E1206 00:27:13.069214 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da3c6bc6-7183-4b38-8c28-79faceca44b3" containerName="extract-utilities" Dec 06 00:27:13 crc kubenswrapper[4734]: I1206 00:27:13.069236 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="da3c6bc6-7183-4b38-8c28-79faceca44b3" containerName="extract-utilities" Dec 06 00:27:13 crc kubenswrapper[4734]: E1206 00:27:13.069263 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da3c6bc6-7183-4b38-8c28-79faceca44b3" containerName="registry-server" Dec 06 00:27:13 crc kubenswrapper[4734]: I1206 00:27:13.069269 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="da3c6bc6-7183-4b38-8c28-79faceca44b3" containerName="registry-server" 
Dec 06 00:27:13 crc kubenswrapper[4734]: E1206 00:27:13.069292 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da3c6bc6-7183-4b38-8c28-79faceca44b3" containerName="extract-content" Dec 06 00:27:13 crc kubenswrapper[4734]: I1206 00:27:13.069299 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="da3c6bc6-7183-4b38-8c28-79faceca44b3" containerName="extract-content" Dec 06 00:27:13 crc kubenswrapper[4734]: I1206 00:27:13.069584 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="da3c6bc6-7183-4b38-8c28-79faceca44b3" containerName="registry-server" Dec 06 00:27:13 crc kubenswrapper[4734]: I1206 00:27:13.071653 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kfj5q" Dec 06 00:27:13 crc kubenswrapper[4734]: I1206 00:27:13.081612 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kfj5q"] Dec 06 00:27:13 crc kubenswrapper[4734]: I1206 00:27:13.149718 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd618706-8480-4795-936f-5a5524604fd7-utilities\") pod \"redhat-operators-kfj5q\" (UID: \"dd618706-8480-4795-936f-5a5524604fd7\") " pod="openshift-marketplace/redhat-operators-kfj5q" Dec 06 00:27:13 crc kubenswrapper[4734]: I1206 00:27:13.149917 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lb7x\" (UniqueName: \"kubernetes.io/projected/dd618706-8480-4795-936f-5a5524604fd7-kube-api-access-9lb7x\") pod \"redhat-operators-kfj5q\" (UID: \"dd618706-8480-4795-936f-5a5524604fd7\") " pod="openshift-marketplace/redhat-operators-kfj5q" Dec 06 00:27:13 crc kubenswrapper[4734]: I1206 00:27:13.150137 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/dd618706-8480-4795-936f-5a5524604fd7-catalog-content\") pod \"redhat-operators-kfj5q\" (UID: \"dd618706-8480-4795-936f-5a5524604fd7\") " pod="openshift-marketplace/redhat-operators-kfj5q" Dec 06 00:27:13 crc kubenswrapper[4734]: I1206 00:27:13.252891 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd618706-8480-4795-936f-5a5524604fd7-catalog-content\") pod \"redhat-operators-kfj5q\" (UID: \"dd618706-8480-4795-936f-5a5524604fd7\") " pod="openshift-marketplace/redhat-operators-kfj5q" Dec 06 00:27:13 crc kubenswrapper[4734]: I1206 00:27:13.253013 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd618706-8480-4795-936f-5a5524604fd7-utilities\") pod \"redhat-operators-kfj5q\" (UID: \"dd618706-8480-4795-936f-5a5524604fd7\") " pod="openshift-marketplace/redhat-operators-kfj5q" Dec 06 00:27:13 crc kubenswrapper[4734]: I1206 00:27:13.253095 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lb7x\" (UniqueName: \"kubernetes.io/projected/dd618706-8480-4795-936f-5a5524604fd7-kube-api-access-9lb7x\") pod \"redhat-operators-kfj5q\" (UID: \"dd618706-8480-4795-936f-5a5524604fd7\") " pod="openshift-marketplace/redhat-operators-kfj5q" Dec 06 00:27:13 crc kubenswrapper[4734]: I1206 00:27:13.253744 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd618706-8480-4795-936f-5a5524604fd7-utilities\") pod \"redhat-operators-kfj5q\" (UID: \"dd618706-8480-4795-936f-5a5524604fd7\") " pod="openshift-marketplace/redhat-operators-kfj5q" Dec 06 00:27:13 crc kubenswrapper[4734]: I1206 00:27:13.254127 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/dd618706-8480-4795-936f-5a5524604fd7-catalog-content\") pod \"redhat-operators-kfj5q\" (UID: \"dd618706-8480-4795-936f-5a5524604fd7\") " pod="openshift-marketplace/redhat-operators-kfj5q" Dec 06 00:27:13 crc kubenswrapper[4734]: I1206 00:27:13.259948 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lnr4g"] Dec 06 00:27:13 crc kubenswrapper[4734]: I1206 00:27:13.262313 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lnr4g" Dec 06 00:27:13 crc kubenswrapper[4734]: I1206 00:27:13.281626 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lb7x\" (UniqueName: \"kubernetes.io/projected/dd618706-8480-4795-936f-5a5524604fd7-kube-api-access-9lb7x\") pod \"redhat-operators-kfj5q\" (UID: \"dd618706-8480-4795-936f-5a5524604fd7\") " pod="openshift-marketplace/redhat-operators-kfj5q" Dec 06 00:27:13 crc kubenswrapper[4734]: I1206 00:27:13.282675 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lnr4g"] Dec 06 00:27:13 crc kubenswrapper[4734]: I1206 00:27:13.355904 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3a3ccd9-2bd4-40d8-817f-d7444a11610e-catalog-content\") pod \"redhat-marketplace-lnr4g\" (UID: \"d3a3ccd9-2bd4-40d8-817f-d7444a11610e\") " pod="openshift-marketplace/redhat-marketplace-lnr4g" Dec 06 00:27:13 crc kubenswrapper[4734]: I1206 00:27:13.356094 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3a3ccd9-2bd4-40d8-817f-d7444a11610e-utilities\") pod \"redhat-marketplace-lnr4g\" (UID: \"d3a3ccd9-2bd4-40d8-817f-d7444a11610e\") " pod="openshift-marketplace/redhat-marketplace-lnr4g" Dec 06 00:27:13 crc kubenswrapper[4734]: 
I1206 00:27:13.356141 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52nhs\" (UniqueName: \"kubernetes.io/projected/d3a3ccd9-2bd4-40d8-817f-d7444a11610e-kube-api-access-52nhs\") pod \"redhat-marketplace-lnr4g\" (UID: \"d3a3ccd9-2bd4-40d8-817f-d7444a11610e\") " pod="openshift-marketplace/redhat-marketplace-lnr4g" Dec 06 00:27:13 crc kubenswrapper[4734]: I1206 00:27:13.399941 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kfj5q" Dec 06 00:27:13 crc kubenswrapper[4734]: I1206 00:27:13.458688 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3a3ccd9-2bd4-40d8-817f-d7444a11610e-catalog-content\") pod \"redhat-marketplace-lnr4g\" (UID: \"d3a3ccd9-2bd4-40d8-817f-d7444a11610e\") " pod="openshift-marketplace/redhat-marketplace-lnr4g" Dec 06 00:27:13 crc kubenswrapper[4734]: I1206 00:27:13.458875 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3a3ccd9-2bd4-40d8-817f-d7444a11610e-utilities\") pod \"redhat-marketplace-lnr4g\" (UID: \"d3a3ccd9-2bd4-40d8-817f-d7444a11610e\") " pod="openshift-marketplace/redhat-marketplace-lnr4g" Dec 06 00:27:13 crc kubenswrapper[4734]: I1206 00:27:13.458947 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52nhs\" (UniqueName: \"kubernetes.io/projected/d3a3ccd9-2bd4-40d8-817f-d7444a11610e-kube-api-access-52nhs\") pod \"redhat-marketplace-lnr4g\" (UID: \"d3a3ccd9-2bd4-40d8-817f-d7444a11610e\") " pod="openshift-marketplace/redhat-marketplace-lnr4g" Dec 06 00:27:13 crc kubenswrapper[4734]: I1206 00:27:13.460967 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d3a3ccd9-2bd4-40d8-817f-d7444a11610e-catalog-content\") pod \"redhat-marketplace-lnr4g\" (UID: \"d3a3ccd9-2bd4-40d8-817f-d7444a11610e\") " pod="openshift-marketplace/redhat-marketplace-lnr4g" Dec 06 00:27:13 crc kubenswrapper[4734]: I1206 00:27:13.461267 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3a3ccd9-2bd4-40d8-817f-d7444a11610e-utilities\") pod \"redhat-marketplace-lnr4g\" (UID: \"d3a3ccd9-2bd4-40d8-817f-d7444a11610e\") " pod="openshift-marketplace/redhat-marketplace-lnr4g" Dec 06 00:27:13 crc kubenswrapper[4734]: I1206 00:27:13.483357 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52nhs\" (UniqueName: \"kubernetes.io/projected/d3a3ccd9-2bd4-40d8-817f-d7444a11610e-kube-api-access-52nhs\") pod \"redhat-marketplace-lnr4g\" (UID: \"d3a3ccd9-2bd4-40d8-817f-d7444a11610e\") " pod="openshift-marketplace/redhat-marketplace-lnr4g" Dec 06 00:27:13 crc kubenswrapper[4734]: I1206 00:27:13.647304 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lnr4g" Dec 06 00:27:13 crc kubenswrapper[4734]: I1206 00:27:13.811570 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kfj5q"] Dec 06 00:27:14 crc kubenswrapper[4734]: W1206 00:27:14.241111 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3a3ccd9_2bd4_40d8_817f_d7444a11610e.slice/crio-8f95863b0d19da3d7843c021c5a32dada0b3b20352bc5287d1b35bfd1eb51d03 WatchSource:0}: Error finding container 8f95863b0d19da3d7843c021c5a32dada0b3b20352bc5287d1b35bfd1eb51d03: Status 404 returned error can't find the container with id 8f95863b0d19da3d7843c021c5a32dada0b3b20352bc5287d1b35bfd1eb51d03 Dec 06 00:27:14 crc kubenswrapper[4734]: I1206 00:27:14.244674 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lnr4g"] Dec 06 00:27:14 crc kubenswrapper[4734]: I1206 00:27:14.763768 4734 generic.go:334] "Generic (PLEG): container finished" podID="dd618706-8480-4795-936f-5a5524604fd7" containerID="8a2379b40e47983e11f6b4d549b0f1fba9cdf9e679e0dd2f3e84ccde11a076fa" exitCode=0 Dec 06 00:27:14 crc kubenswrapper[4734]: I1206 00:27:14.763894 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kfj5q" event={"ID":"dd618706-8480-4795-936f-5a5524604fd7","Type":"ContainerDied","Data":"8a2379b40e47983e11f6b4d549b0f1fba9cdf9e679e0dd2f3e84ccde11a076fa"} Dec 06 00:27:14 crc kubenswrapper[4734]: I1206 00:27:14.764253 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kfj5q" event={"ID":"dd618706-8480-4795-936f-5a5524604fd7","Type":"ContainerStarted","Data":"1550b1532590f8b6ea723eb5f4d9386a8a48496eed35e84710e7b6e3867beda8"} Dec 06 00:27:14 crc kubenswrapper[4734]: I1206 00:27:14.766242 4734 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Dec 06 00:27:14 crc kubenswrapper[4734]: I1206 00:27:14.766640 4734 generic.go:334] "Generic (PLEG): container finished" podID="d3a3ccd9-2bd4-40d8-817f-d7444a11610e" containerID="19d21d310e7beea834130c2bd37855b1f80f8b3f8e50924aef7d431bda7c613a" exitCode=0 Dec 06 00:27:14 crc kubenswrapper[4734]: I1206 00:27:14.766771 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lnr4g" event={"ID":"d3a3ccd9-2bd4-40d8-817f-d7444a11610e","Type":"ContainerDied","Data":"19d21d310e7beea834130c2bd37855b1f80f8b3f8e50924aef7d431bda7c613a"} Dec 06 00:27:14 crc kubenswrapper[4734]: I1206 00:27:14.766861 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lnr4g" event={"ID":"d3a3ccd9-2bd4-40d8-817f-d7444a11610e","Type":"ContainerStarted","Data":"8f95863b0d19da3d7843c021c5a32dada0b3b20352bc5287d1b35bfd1eb51d03"} Dec 06 00:27:15 crc kubenswrapper[4734]: I1206 00:27:15.783223 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kfj5q" event={"ID":"dd618706-8480-4795-936f-5a5524604fd7","Type":"ContainerStarted","Data":"5717f74437ab5d58e90e07278f1ea76faaade2307ba542148ab62540ce5745ec"} Dec 06 00:27:15 crc kubenswrapper[4734]: I1206 00:27:15.791195 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lnr4g" event={"ID":"d3a3ccd9-2bd4-40d8-817f-d7444a11610e","Type":"ContainerStarted","Data":"db81d186666d959eb9be14b7ac680382ea51f46243e7886403ceb4e29e694041"} Dec 06 00:27:16 crc kubenswrapper[4734]: I1206 00:27:16.804241 4734 generic.go:334] "Generic (PLEG): container finished" podID="d3a3ccd9-2bd4-40d8-817f-d7444a11610e" containerID="db81d186666d959eb9be14b7ac680382ea51f46243e7886403ceb4e29e694041" exitCode=0 Dec 06 00:27:16 crc kubenswrapper[4734]: I1206 00:27:16.804328 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-lnr4g" event={"ID":"d3a3ccd9-2bd4-40d8-817f-d7444a11610e","Type":"ContainerDied","Data":"db81d186666d959eb9be14b7ac680382ea51f46243e7886403ceb4e29e694041"} Dec 06 00:27:16 crc kubenswrapper[4734]: I1206 00:27:16.810791 4734 generic.go:334] "Generic (PLEG): container finished" podID="dd618706-8480-4795-936f-5a5524604fd7" containerID="5717f74437ab5d58e90e07278f1ea76faaade2307ba542148ab62540ce5745ec" exitCode=0 Dec 06 00:27:16 crc kubenswrapper[4734]: I1206 00:27:16.810858 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kfj5q" event={"ID":"dd618706-8480-4795-936f-5a5524604fd7","Type":"ContainerDied","Data":"5717f74437ab5d58e90e07278f1ea76faaade2307ba542148ab62540ce5745ec"} Dec 06 00:27:17 crc kubenswrapper[4734]: I1206 00:27:17.858614 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kfj5q" event={"ID":"dd618706-8480-4795-936f-5a5524604fd7","Type":"ContainerStarted","Data":"4d0102afb4d7507d55822404a4c1ad76ed5086668629c26cfd509cf8a1396deb"} Dec 06 00:27:17 crc kubenswrapper[4734]: I1206 00:27:17.865331 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lnr4g" event={"ID":"d3a3ccd9-2bd4-40d8-817f-d7444a11610e","Type":"ContainerStarted","Data":"576a58481e539397c7537bec1b36b74594e9276783c762deb721a9f4a4b6dd59"} Dec 06 00:27:17 crc kubenswrapper[4734]: I1206 00:27:17.878373 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kfj5q" podStartSLOduration=2.401369104 podStartE2EDuration="4.87835309s" podCreationTimestamp="2025-12-06 00:27:13 +0000 UTC" firstStartedPulling="2025-12-06 00:27:14.765917488 +0000 UTC m=+4055.449321764" lastFinishedPulling="2025-12-06 00:27:17.242901484 +0000 UTC m=+4057.926305750" observedRunningTime="2025-12-06 00:27:17.877871999 +0000 UTC m=+4058.561276275" 
watchObservedRunningTime="2025-12-06 00:27:17.87835309 +0000 UTC m=+4058.561757366" Dec 06 00:27:17 crc kubenswrapper[4734]: I1206 00:27:17.898711 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lnr4g" podStartSLOduration=2.455158585 podStartE2EDuration="4.89868538s" podCreationTimestamp="2025-12-06 00:27:13 +0000 UTC" firstStartedPulling="2025-12-06 00:27:14.769475256 +0000 UTC m=+4055.452879532" lastFinishedPulling="2025-12-06 00:27:17.213002051 +0000 UTC m=+4057.896406327" observedRunningTime="2025-12-06 00:27:17.896385353 +0000 UTC m=+4058.579789629" watchObservedRunningTime="2025-12-06 00:27:17.89868538 +0000 UTC m=+4058.582089656" Dec 06 00:27:22 crc kubenswrapper[4734]: I1206 00:27:22.926755 4734 generic.go:334] "Generic (PLEG): container finished" podID="8d96ce40-7782-46c2-816b-6e792c06b2f1" containerID="6e3cbdf545de6ed0d0a79997cac3743afad5ce1e25b03bb3b6f513afcf5c6fad" exitCode=0 Dec 06 00:27:22 crc kubenswrapper[4734]: I1206 00:27:22.926853 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jnf5s/must-gather-hktf7" event={"ID":"8d96ce40-7782-46c2-816b-6e792c06b2f1","Type":"ContainerDied","Data":"6e3cbdf545de6ed0d0a79997cac3743afad5ce1e25b03bb3b6f513afcf5c6fad"} Dec 06 00:27:22 crc kubenswrapper[4734]: I1206 00:27:22.928158 4734 scope.go:117] "RemoveContainer" containerID="6e3cbdf545de6ed0d0a79997cac3743afad5ce1e25b03bb3b6f513afcf5c6fad" Dec 06 00:27:23 crc kubenswrapper[4734]: I1206 00:27:23.419066 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kfj5q" Dec 06 00:27:23 crc kubenswrapper[4734]: I1206 00:27:23.422462 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kfj5q" Dec 06 00:27:23 crc kubenswrapper[4734]: I1206 00:27:23.478172 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-kfj5q" Dec 06 00:27:23 crc kubenswrapper[4734]: I1206 00:27:23.648549 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lnr4g" Dec 06 00:27:23 crc kubenswrapper[4734]: I1206 00:27:23.648631 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lnr4g" Dec 06 00:27:23 crc kubenswrapper[4734]: I1206 00:27:23.697122 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jnf5s_must-gather-hktf7_8d96ce40-7782-46c2-816b-6e792c06b2f1/gather/0.log" Dec 06 00:27:23 crc kubenswrapper[4734]: I1206 00:27:23.705426 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lnr4g" Dec 06 00:27:23 crc kubenswrapper[4734]: I1206 00:27:23.989467 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kfj5q" Dec 06 00:27:24 crc kubenswrapper[4734]: I1206 00:27:24.009940 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lnr4g" Dec 06 00:27:24 crc kubenswrapper[4734]: I1206 00:27:24.649674 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kfj5q"] Dec 06 00:27:25 crc kubenswrapper[4734]: I1206 00:27:25.961072 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kfj5q" podUID="dd618706-8480-4795-936f-5a5524604fd7" containerName="registry-server" containerID="cri-o://4d0102afb4d7507d55822404a4c1ad76ed5086668629c26cfd509cf8a1396deb" gracePeriod=2 Dec 06 00:27:26 crc kubenswrapper[4734]: I1206 00:27:26.048459 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lnr4g"] Dec 06 00:27:26 crc kubenswrapper[4734]: I1206 00:27:26.048846 4734 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lnr4g" podUID="d3a3ccd9-2bd4-40d8-817f-d7444a11610e" containerName="registry-server" containerID="cri-o://576a58481e539397c7537bec1b36b74594e9276783c762deb721a9f4a4b6dd59" gracePeriod=2 Dec 06 00:27:26 crc kubenswrapper[4734]: I1206 00:27:26.508323 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kfj5q" Dec 06 00:27:26 crc kubenswrapper[4734]: I1206 00:27:26.593169 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lnr4g" Dec 06 00:27:26 crc kubenswrapper[4734]: I1206 00:27:26.597637 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd618706-8480-4795-936f-5a5524604fd7-catalog-content\") pod \"dd618706-8480-4795-936f-5a5524604fd7\" (UID: \"dd618706-8480-4795-936f-5a5524604fd7\") " Dec 06 00:27:26 crc kubenswrapper[4734]: I1206 00:27:26.597777 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd618706-8480-4795-936f-5a5524604fd7-utilities\") pod \"dd618706-8480-4795-936f-5a5524604fd7\" (UID: \"dd618706-8480-4795-936f-5a5524604fd7\") " Dec 06 00:27:26 crc kubenswrapper[4734]: I1206 00:27:26.597940 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lb7x\" (UniqueName: \"kubernetes.io/projected/dd618706-8480-4795-936f-5a5524604fd7-kube-api-access-9lb7x\") pod \"dd618706-8480-4795-936f-5a5524604fd7\" (UID: \"dd618706-8480-4795-936f-5a5524604fd7\") " Dec 06 00:27:26 crc kubenswrapper[4734]: I1206 00:27:26.598741 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd618706-8480-4795-936f-5a5524604fd7-utilities" (OuterVolumeSpecName: 
"utilities") pod "dd618706-8480-4795-936f-5a5524604fd7" (UID: "dd618706-8480-4795-936f-5a5524604fd7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 00:27:26 crc kubenswrapper[4734]: I1206 00:27:26.605312 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd618706-8480-4795-936f-5a5524604fd7-kube-api-access-9lb7x" (OuterVolumeSpecName: "kube-api-access-9lb7x") pod "dd618706-8480-4795-936f-5a5524604fd7" (UID: "dd618706-8480-4795-936f-5a5524604fd7"). InnerVolumeSpecName "kube-api-access-9lb7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:27:26 crc kubenswrapper[4734]: I1206 00:27:26.700158 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52nhs\" (UniqueName: \"kubernetes.io/projected/d3a3ccd9-2bd4-40d8-817f-d7444a11610e-kube-api-access-52nhs\") pod \"d3a3ccd9-2bd4-40d8-817f-d7444a11610e\" (UID: \"d3a3ccd9-2bd4-40d8-817f-d7444a11610e\") " Dec 06 00:27:26 crc kubenswrapper[4734]: I1206 00:27:26.700382 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3a3ccd9-2bd4-40d8-817f-d7444a11610e-catalog-content\") pod \"d3a3ccd9-2bd4-40d8-817f-d7444a11610e\" (UID: \"d3a3ccd9-2bd4-40d8-817f-d7444a11610e\") " Dec 06 00:27:26 crc kubenswrapper[4734]: I1206 00:27:26.700485 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3a3ccd9-2bd4-40d8-817f-d7444a11610e-utilities\") pod \"d3a3ccd9-2bd4-40d8-817f-d7444a11610e\" (UID: \"d3a3ccd9-2bd4-40d8-817f-d7444a11610e\") " Dec 06 00:27:26 crc kubenswrapper[4734]: I1206 00:27:26.701239 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lb7x\" (UniqueName: \"kubernetes.io/projected/dd618706-8480-4795-936f-5a5524604fd7-kube-api-access-9lb7x\") on node \"crc\" DevicePath 
\"\"" Dec 06 00:27:26 crc kubenswrapper[4734]: I1206 00:27:26.701262 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd618706-8480-4795-936f-5a5524604fd7-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 00:27:26 crc kubenswrapper[4734]: I1206 00:27:26.704271 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3a3ccd9-2bd4-40d8-817f-d7444a11610e-kube-api-access-52nhs" (OuterVolumeSpecName: "kube-api-access-52nhs") pod "d3a3ccd9-2bd4-40d8-817f-d7444a11610e" (UID: "d3a3ccd9-2bd4-40d8-817f-d7444a11610e"). InnerVolumeSpecName "kube-api-access-52nhs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:27:26 crc kubenswrapper[4734]: I1206 00:27:26.704539 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3a3ccd9-2bd4-40d8-817f-d7444a11610e-utilities" (OuterVolumeSpecName: "utilities") pod "d3a3ccd9-2bd4-40d8-817f-d7444a11610e" (UID: "d3a3ccd9-2bd4-40d8-817f-d7444a11610e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 00:27:26 crc kubenswrapper[4734]: I1206 00:27:26.727140 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3a3ccd9-2bd4-40d8-817f-d7444a11610e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d3a3ccd9-2bd4-40d8-817f-d7444a11610e" (UID: "d3a3ccd9-2bd4-40d8-817f-d7444a11610e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 00:27:26 crc kubenswrapper[4734]: I1206 00:27:26.727462 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd618706-8480-4795-936f-5a5524604fd7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd618706-8480-4795-936f-5a5524604fd7" (UID: "dd618706-8480-4795-936f-5a5524604fd7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 00:27:26 crc kubenswrapper[4734]: I1206 00:27:26.803860 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52nhs\" (UniqueName: \"kubernetes.io/projected/d3a3ccd9-2bd4-40d8-817f-d7444a11610e-kube-api-access-52nhs\") on node \"crc\" DevicePath \"\"" Dec 06 00:27:26 crc kubenswrapper[4734]: I1206 00:27:26.803921 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3a3ccd9-2bd4-40d8-817f-d7444a11610e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 00:27:26 crc kubenswrapper[4734]: I1206 00:27:26.803938 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd618706-8480-4795-936f-5a5524604fd7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 00:27:26 crc kubenswrapper[4734]: I1206 00:27:26.803947 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3a3ccd9-2bd4-40d8-817f-d7444a11610e-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 00:27:26 crc kubenswrapper[4734]: I1206 00:27:26.972757 4734 generic.go:334] "Generic (PLEG): container finished" podID="d3a3ccd9-2bd4-40d8-817f-d7444a11610e" containerID="576a58481e539397c7537bec1b36b74594e9276783c762deb721a9f4a4b6dd59" exitCode=0 Dec 06 00:27:26 crc kubenswrapper[4734]: I1206 00:27:26.972843 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lnr4g" Dec 06 00:27:26 crc kubenswrapper[4734]: I1206 00:27:26.972865 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lnr4g" event={"ID":"d3a3ccd9-2bd4-40d8-817f-d7444a11610e","Type":"ContainerDied","Data":"576a58481e539397c7537bec1b36b74594e9276783c762deb721a9f4a4b6dd59"} Dec 06 00:27:26 crc kubenswrapper[4734]: I1206 00:27:26.972931 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lnr4g" event={"ID":"d3a3ccd9-2bd4-40d8-817f-d7444a11610e","Type":"ContainerDied","Data":"8f95863b0d19da3d7843c021c5a32dada0b3b20352bc5287d1b35bfd1eb51d03"} Dec 06 00:27:26 crc kubenswrapper[4734]: I1206 00:27:26.972959 4734 scope.go:117] "RemoveContainer" containerID="576a58481e539397c7537bec1b36b74594e9276783c762deb721a9f4a4b6dd59" Dec 06 00:27:26 crc kubenswrapper[4734]: I1206 00:27:26.976052 4734 generic.go:334] "Generic (PLEG): container finished" podID="dd618706-8480-4795-936f-5a5524604fd7" containerID="4d0102afb4d7507d55822404a4c1ad76ed5086668629c26cfd509cf8a1396deb" exitCode=0 Dec 06 00:27:26 crc kubenswrapper[4734]: I1206 00:27:26.976091 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kfj5q" event={"ID":"dd618706-8480-4795-936f-5a5524604fd7","Type":"ContainerDied","Data":"4d0102afb4d7507d55822404a4c1ad76ed5086668629c26cfd509cf8a1396deb"} Dec 06 00:27:26 crc kubenswrapper[4734]: I1206 00:27:26.976123 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kfj5q" event={"ID":"dd618706-8480-4795-936f-5a5524604fd7","Type":"ContainerDied","Data":"1550b1532590f8b6ea723eb5f4d9386a8a48496eed35e84710e7b6e3867beda8"} Dec 06 00:27:26 crc kubenswrapper[4734]: I1206 00:27:26.976152 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kfj5q" Dec 06 00:27:27 crc kubenswrapper[4734]: I1206 00:27:27.006133 4734 scope.go:117] "RemoveContainer" containerID="db81d186666d959eb9be14b7ac680382ea51f46243e7886403ceb4e29e694041" Dec 06 00:27:27 crc kubenswrapper[4734]: I1206 00:27:27.022833 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lnr4g"] Dec 06 00:27:27 crc kubenswrapper[4734]: I1206 00:27:27.033026 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lnr4g"] Dec 06 00:27:27 crc kubenswrapper[4734]: I1206 00:27:27.044159 4734 scope.go:117] "RemoveContainer" containerID="19d21d310e7beea834130c2bd37855b1f80f8b3f8e50924aef7d431bda7c613a" Dec 06 00:27:27 crc kubenswrapper[4734]: I1206 00:27:27.050090 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kfj5q"] Dec 06 00:27:27 crc kubenswrapper[4734]: I1206 00:27:27.062016 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kfj5q"] Dec 06 00:27:27 crc kubenswrapper[4734]: I1206 00:27:27.071486 4734 scope.go:117] "RemoveContainer" containerID="576a58481e539397c7537bec1b36b74594e9276783c762deb721a9f4a4b6dd59" Dec 06 00:27:27 crc kubenswrapper[4734]: E1206 00:27:27.072188 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"576a58481e539397c7537bec1b36b74594e9276783c762deb721a9f4a4b6dd59\": container with ID starting with 576a58481e539397c7537bec1b36b74594e9276783c762deb721a9f4a4b6dd59 not found: ID does not exist" containerID="576a58481e539397c7537bec1b36b74594e9276783c762deb721a9f4a4b6dd59" Dec 06 00:27:27 crc kubenswrapper[4734]: I1206 00:27:27.072225 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"576a58481e539397c7537bec1b36b74594e9276783c762deb721a9f4a4b6dd59"} 
err="failed to get container status \"576a58481e539397c7537bec1b36b74594e9276783c762deb721a9f4a4b6dd59\": rpc error: code = NotFound desc = could not find container \"576a58481e539397c7537bec1b36b74594e9276783c762deb721a9f4a4b6dd59\": container with ID starting with 576a58481e539397c7537bec1b36b74594e9276783c762deb721a9f4a4b6dd59 not found: ID does not exist" Dec 06 00:27:27 crc kubenswrapper[4734]: I1206 00:27:27.072251 4734 scope.go:117] "RemoveContainer" containerID="db81d186666d959eb9be14b7ac680382ea51f46243e7886403ceb4e29e694041" Dec 06 00:27:27 crc kubenswrapper[4734]: E1206 00:27:27.074351 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db81d186666d959eb9be14b7ac680382ea51f46243e7886403ceb4e29e694041\": container with ID starting with db81d186666d959eb9be14b7ac680382ea51f46243e7886403ceb4e29e694041 not found: ID does not exist" containerID="db81d186666d959eb9be14b7ac680382ea51f46243e7886403ceb4e29e694041" Dec 06 00:27:27 crc kubenswrapper[4734]: I1206 00:27:27.074407 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db81d186666d959eb9be14b7ac680382ea51f46243e7886403ceb4e29e694041"} err="failed to get container status \"db81d186666d959eb9be14b7ac680382ea51f46243e7886403ceb4e29e694041\": rpc error: code = NotFound desc = could not find container \"db81d186666d959eb9be14b7ac680382ea51f46243e7886403ceb4e29e694041\": container with ID starting with db81d186666d959eb9be14b7ac680382ea51f46243e7886403ceb4e29e694041 not found: ID does not exist" Dec 06 00:27:27 crc kubenswrapper[4734]: I1206 00:27:27.074454 4734 scope.go:117] "RemoveContainer" containerID="19d21d310e7beea834130c2bd37855b1f80f8b3f8e50924aef7d431bda7c613a" Dec 06 00:27:27 crc kubenswrapper[4734]: E1206 00:27:27.076614 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"19d21d310e7beea834130c2bd37855b1f80f8b3f8e50924aef7d431bda7c613a\": container with ID starting with 19d21d310e7beea834130c2bd37855b1f80f8b3f8e50924aef7d431bda7c613a not found: ID does not exist" containerID="19d21d310e7beea834130c2bd37855b1f80f8b3f8e50924aef7d431bda7c613a" Dec 06 00:27:27 crc kubenswrapper[4734]: I1206 00:27:27.076644 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19d21d310e7beea834130c2bd37855b1f80f8b3f8e50924aef7d431bda7c613a"} err="failed to get container status \"19d21d310e7beea834130c2bd37855b1f80f8b3f8e50924aef7d431bda7c613a\": rpc error: code = NotFound desc = could not find container \"19d21d310e7beea834130c2bd37855b1f80f8b3f8e50924aef7d431bda7c613a\": container with ID starting with 19d21d310e7beea834130c2bd37855b1f80f8b3f8e50924aef7d431bda7c613a not found: ID does not exist" Dec 06 00:27:27 crc kubenswrapper[4734]: I1206 00:27:27.076663 4734 scope.go:117] "RemoveContainer" containerID="4d0102afb4d7507d55822404a4c1ad76ed5086668629c26cfd509cf8a1396deb" Dec 06 00:27:27 crc kubenswrapper[4734]: I1206 00:27:27.102610 4734 scope.go:117] "RemoveContainer" containerID="5717f74437ab5d58e90e07278f1ea76faaade2307ba542148ab62540ce5745ec" Dec 06 00:27:27 crc kubenswrapper[4734]: I1206 00:27:27.142082 4734 scope.go:117] "RemoveContainer" containerID="8a2379b40e47983e11f6b4d549b0f1fba9cdf9e679e0dd2f3e84ccde11a076fa" Dec 06 00:27:27 crc kubenswrapper[4734]: I1206 00:27:27.170642 4734 scope.go:117] "RemoveContainer" containerID="4d0102afb4d7507d55822404a4c1ad76ed5086668629c26cfd509cf8a1396deb" Dec 06 00:27:27 crc kubenswrapper[4734]: E1206 00:27:27.171225 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d0102afb4d7507d55822404a4c1ad76ed5086668629c26cfd509cf8a1396deb\": container with ID starting with 4d0102afb4d7507d55822404a4c1ad76ed5086668629c26cfd509cf8a1396deb not found: ID does not exist" 
containerID="4d0102afb4d7507d55822404a4c1ad76ed5086668629c26cfd509cf8a1396deb" Dec 06 00:27:27 crc kubenswrapper[4734]: I1206 00:27:27.171275 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d0102afb4d7507d55822404a4c1ad76ed5086668629c26cfd509cf8a1396deb"} err="failed to get container status \"4d0102afb4d7507d55822404a4c1ad76ed5086668629c26cfd509cf8a1396deb\": rpc error: code = NotFound desc = could not find container \"4d0102afb4d7507d55822404a4c1ad76ed5086668629c26cfd509cf8a1396deb\": container with ID starting with 4d0102afb4d7507d55822404a4c1ad76ed5086668629c26cfd509cf8a1396deb not found: ID does not exist" Dec 06 00:27:27 crc kubenswrapper[4734]: I1206 00:27:27.171305 4734 scope.go:117] "RemoveContainer" containerID="5717f74437ab5d58e90e07278f1ea76faaade2307ba542148ab62540ce5745ec" Dec 06 00:27:27 crc kubenswrapper[4734]: E1206 00:27:27.171705 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5717f74437ab5d58e90e07278f1ea76faaade2307ba542148ab62540ce5745ec\": container with ID starting with 5717f74437ab5d58e90e07278f1ea76faaade2307ba542148ab62540ce5745ec not found: ID does not exist" containerID="5717f74437ab5d58e90e07278f1ea76faaade2307ba542148ab62540ce5745ec" Dec 06 00:27:27 crc kubenswrapper[4734]: I1206 00:27:27.171788 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5717f74437ab5d58e90e07278f1ea76faaade2307ba542148ab62540ce5745ec"} err="failed to get container status \"5717f74437ab5d58e90e07278f1ea76faaade2307ba542148ab62540ce5745ec\": rpc error: code = NotFound desc = could not find container \"5717f74437ab5d58e90e07278f1ea76faaade2307ba542148ab62540ce5745ec\": container with ID starting with 5717f74437ab5d58e90e07278f1ea76faaade2307ba542148ab62540ce5745ec not found: ID does not exist" Dec 06 00:27:27 crc kubenswrapper[4734]: I1206 00:27:27.171834 4734 scope.go:117] 
"RemoveContainer" containerID="8a2379b40e47983e11f6b4d549b0f1fba9cdf9e679e0dd2f3e84ccde11a076fa" Dec 06 00:27:27 crc kubenswrapper[4734]: E1206 00:27:27.172845 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a2379b40e47983e11f6b4d549b0f1fba9cdf9e679e0dd2f3e84ccde11a076fa\": container with ID starting with 8a2379b40e47983e11f6b4d549b0f1fba9cdf9e679e0dd2f3e84ccde11a076fa not found: ID does not exist" containerID="8a2379b40e47983e11f6b4d549b0f1fba9cdf9e679e0dd2f3e84ccde11a076fa" Dec 06 00:27:27 crc kubenswrapper[4734]: I1206 00:27:27.172883 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a2379b40e47983e11f6b4d549b0f1fba9cdf9e679e0dd2f3e84ccde11a076fa"} err="failed to get container status \"8a2379b40e47983e11f6b4d549b0f1fba9cdf9e679e0dd2f3e84ccde11a076fa\": rpc error: code = NotFound desc = could not find container \"8a2379b40e47983e11f6b4d549b0f1fba9cdf9e679e0dd2f3e84ccde11a076fa\": container with ID starting with 8a2379b40e47983e11f6b4d549b0f1fba9cdf9e679e0dd2f3e84ccde11a076fa not found: ID does not exist" Dec 06 00:27:27 crc kubenswrapper[4734]: I1206 00:27:27.627743 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3a3ccd9-2bd4-40d8-817f-d7444a11610e" path="/var/lib/kubelet/pods/d3a3ccd9-2bd4-40d8-817f-d7444a11610e/volumes" Dec 06 00:27:27 crc kubenswrapper[4734]: I1206 00:27:27.628961 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd618706-8480-4795-936f-5a5524604fd7" path="/var/lib/kubelet/pods/dd618706-8480-4795-936f-5a5524604fd7/volumes" Dec 06 00:27:31 crc kubenswrapper[4734]: I1206 00:27:31.341158 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jnf5s/must-gather-hktf7"] Dec 06 00:27:31 crc kubenswrapper[4734]: I1206 00:27:31.341928 4734 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-must-gather-jnf5s/must-gather-hktf7" podUID="8d96ce40-7782-46c2-816b-6e792c06b2f1" containerName="copy" containerID="cri-o://41371b6cee252591242fc625ba513d815f17460aaec8ae08ca46e4cf73735cb7" gracePeriod=2 Dec 06 00:27:31 crc kubenswrapper[4734]: I1206 00:27:31.382058 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jnf5s/must-gather-hktf7"] Dec 06 00:27:31 crc kubenswrapper[4734]: I1206 00:27:31.827262 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jnf5s_must-gather-hktf7_8d96ce40-7782-46c2-816b-6e792c06b2f1/copy/0.log" Dec 06 00:27:31 crc kubenswrapper[4734]: I1206 00:27:31.828219 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jnf5s/must-gather-hktf7" Dec 06 00:27:31 crc kubenswrapper[4734]: I1206 00:27:31.940908 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8d96ce40-7782-46c2-816b-6e792c06b2f1-must-gather-output\") pod \"8d96ce40-7782-46c2-816b-6e792c06b2f1\" (UID: \"8d96ce40-7782-46c2-816b-6e792c06b2f1\") " Dec 06 00:27:31 crc kubenswrapper[4734]: I1206 00:27:31.941108 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82w7j\" (UniqueName: \"kubernetes.io/projected/8d96ce40-7782-46c2-816b-6e792c06b2f1-kube-api-access-82w7j\") pod \"8d96ce40-7782-46c2-816b-6e792c06b2f1\" (UID: \"8d96ce40-7782-46c2-816b-6e792c06b2f1\") " Dec 06 00:27:31 crc kubenswrapper[4734]: I1206 00:27:31.948480 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d96ce40-7782-46c2-816b-6e792c06b2f1-kube-api-access-82w7j" (OuterVolumeSpecName: "kube-api-access-82w7j") pod "8d96ce40-7782-46c2-816b-6e792c06b2f1" (UID: "8d96ce40-7782-46c2-816b-6e792c06b2f1"). InnerVolumeSpecName "kube-api-access-82w7j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:27:32 crc kubenswrapper[4734]: I1206 00:27:32.028487 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jnf5s_must-gather-hktf7_8d96ce40-7782-46c2-816b-6e792c06b2f1/copy/0.log" Dec 06 00:27:32 crc kubenswrapper[4734]: I1206 00:27:32.028889 4734 generic.go:334] "Generic (PLEG): container finished" podID="8d96ce40-7782-46c2-816b-6e792c06b2f1" containerID="41371b6cee252591242fc625ba513d815f17460aaec8ae08ca46e4cf73735cb7" exitCode=143 Dec 06 00:27:32 crc kubenswrapper[4734]: I1206 00:27:32.028964 4734 scope.go:117] "RemoveContainer" containerID="41371b6cee252591242fc625ba513d815f17460aaec8ae08ca46e4cf73735cb7" Dec 06 00:27:32 crc kubenswrapper[4734]: I1206 00:27:32.029230 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jnf5s/must-gather-hktf7" Dec 06 00:27:32 crc kubenswrapper[4734]: I1206 00:27:32.044507 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82w7j\" (UniqueName: \"kubernetes.io/projected/8d96ce40-7782-46c2-816b-6e792c06b2f1-kube-api-access-82w7j\") on node \"crc\" DevicePath \"\"" Dec 06 00:27:32 crc kubenswrapper[4734]: I1206 00:27:32.056085 4734 scope.go:117] "RemoveContainer" containerID="6e3cbdf545de6ed0d0a79997cac3743afad5ce1e25b03bb3b6f513afcf5c6fad" Dec 06 00:27:32 crc kubenswrapper[4734]: I1206 00:27:32.095106 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d96ce40-7782-46c2-816b-6e792c06b2f1-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "8d96ce40-7782-46c2-816b-6e792c06b2f1" (UID: "8d96ce40-7782-46c2-816b-6e792c06b2f1"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 00:27:32 crc kubenswrapper[4734]: I1206 00:27:32.147274 4734 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8d96ce40-7782-46c2-816b-6e792c06b2f1-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 06 00:27:32 crc kubenswrapper[4734]: I1206 00:27:32.161648 4734 scope.go:117] "RemoveContainer" containerID="41371b6cee252591242fc625ba513d815f17460aaec8ae08ca46e4cf73735cb7" Dec 06 00:27:32 crc kubenswrapper[4734]: E1206 00:27:32.163060 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41371b6cee252591242fc625ba513d815f17460aaec8ae08ca46e4cf73735cb7\": container with ID starting with 41371b6cee252591242fc625ba513d815f17460aaec8ae08ca46e4cf73735cb7 not found: ID does not exist" containerID="41371b6cee252591242fc625ba513d815f17460aaec8ae08ca46e4cf73735cb7" Dec 06 00:27:32 crc kubenswrapper[4734]: I1206 00:27:32.163233 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41371b6cee252591242fc625ba513d815f17460aaec8ae08ca46e4cf73735cb7"} err="failed to get container status \"41371b6cee252591242fc625ba513d815f17460aaec8ae08ca46e4cf73735cb7\": rpc error: code = NotFound desc = could not find container \"41371b6cee252591242fc625ba513d815f17460aaec8ae08ca46e4cf73735cb7\": container with ID starting with 41371b6cee252591242fc625ba513d815f17460aaec8ae08ca46e4cf73735cb7 not found: ID does not exist" Dec 06 00:27:32 crc kubenswrapper[4734]: I1206 00:27:32.163285 4734 scope.go:117] "RemoveContainer" containerID="6e3cbdf545de6ed0d0a79997cac3743afad5ce1e25b03bb3b6f513afcf5c6fad" Dec 06 00:27:32 crc kubenswrapper[4734]: E1206 00:27:32.164434 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6e3cbdf545de6ed0d0a79997cac3743afad5ce1e25b03bb3b6f513afcf5c6fad\": container with ID starting with 6e3cbdf545de6ed0d0a79997cac3743afad5ce1e25b03bb3b6f513afcf5c6fad not found: ID does not exist" containerID="6e3cbdf545de6ed0d0a79997cac3743afad5ce1e25b03bb3b6f513afcf5c6fad" Dec 06 00:27:32 crc kubenswrapper[4734]: I1206 00:27:32.164568 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e3cbdf545de6ed0d0a79997cac3743afad5ce1e25b03bb3b6f513afcf5c6fad"} err="failed to get container status \"6e3cbdf545de6ed0d0a79997cac3743afad5ce1e25b03bb3b6f513afcf5c6fad\": rpc error: code = NotFound desc = could not find container \"6e3cbdf545de6ed0d0a79997cac3743afad5ce1e25b03bb3b6f513afcf5c6fad\": container with ID starting with 6e3cbdf545de6ed0d0a79997cac3743afad5ce1e25b03bb3b6f513afcf5c6fad not found: ID does not exist" Dec 06 00:27:33 crc kubenswrapper[4734]: I1206 00:27:33.625512 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d96ce40-7782-46c2-816b-6e792c06b2f1" path="/var/lib/kubelet/pods/8d96ce40-7782-46c2-816b-6e792c06b2f1/volumes" Dec 06 00:28:20 crc kubenswrapper[4734]: I1206 00:28:20.445315 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 00:28:20 crc kubenswrapper[4734]: I1206 00:28:20.446192 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 00:28:50 crc kubenswrapper[4734]: I1206 00:28:50.445313 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 00:28:50 crc kubenswrapper[4734]: I1206 00:28:50.447485 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 00:28:57 crc kubenswrapper[4734]: I1206 00:28:57.459104 4734 scope.go:117] "RemoveContainer" containerID="bc969b76a100f0f9add329a6e69318987c7d1a46cd1f297a1af7dbce2ba20782" Dec 06 00:29:20 crc kubenswrapper[4734]: I1206 00:29:20.444777 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 00:29:20 crc kubenswrapper[4734]: I1206 00:29:20.445560 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 00:29:20 crc kubenswrapper[4734]: I1206 00:29:20.445620 4734 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" Dec 06 00:29:20 crc kubenswrapper[4734]: I1206 00:29:20.446624 4734 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"81278da5539b6ec2789a9334b6c891b6b36cb63a7e5b4a031f5c2f40b60a134e"} pod="openshift-machine-config-operator/machine-config-daemon-vn94d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 00:29:20 crc kubenswrapper[4734]: I1206 00:29:20.446690 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" containerID="cri-o://81278da5539b6ec2789a9334b6c891b6b36cb63a7e5b4a031f5c2f40b60a134e" gracePeriod=600 Dec 06 00:29:21 crc kubenswrapper[4734]: I1206 00:29:21.154056 4734 generic.go:334] "Generic (PLEG): container finished" podID="65758270-a7a7-46b5-af95-0588daf9fa86" containerID="81278da5539b6ec2789a9334b6c891b6b36cb63a7e5b4a031f5c2f40b60a134e" exitCode=0 Dec 06 00:29:21 crc kubenswrapper[4734]: I1206 00:29:21.154138 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerDied","Data":"81278da5539b6ec2789a9334b6c891b6b36cb63a7e5b4a031f5c2f40b60a134e"} Dec 06 00:29:21 crc kubenswrapper[4734]: I1206 00:29:21.154787 4734 scope.go:117] "RemoveContainer" containerID="30e2002c7070045c2b20466e69d767cf741f7938dac6f70dfac6c07537e1dd50" Dec 06 00:29:21 crc kubenswrapper[4734]: I1206 00:29:21.155805 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerStarted","Data":"defa7a915f6f81a237f6907715eb5c4650a4cd13fa52e345ad569553ddc1f24d"} Dec 06 00:30:00 crc kubenswrapper[4734]: I1206 00:30:00.194056 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416350-w9tks"] Dec 06 00:30:00 crc kubenswrapper[4734]: 
E1206 00:30:00.195370 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d96ce40-7782-46c2-816b-6e792c06b2f1" containerName="copy" Dec 06 00:30:00 crc kubenswrapper[4734]: I1206 00:30:00.195389 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d96ce40-7782-46c2-816b-6e792c06b2f1" containerName="copy" Dec 06 00:30:00 crc kubenswrapper[4734]: E1206 00:30:00.195413 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3a3ccd9-2bd4-40d8-817f-d7444a11610e" containerName="registry-server" Dec 06 00:30:00 crc kubenswrapper[4734]: I1206 00:30:00.195419 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3a3ccd9-2bd4-40d8-817f-d7444a11610e" containerName="registry-server" Dec 06 00:30:00 crc kubenswrapper[4734]: E1206 00:30:00.195434 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd618706-8480-4795-936f-5a5524604fd7" containerName="extract-utilities" Dec 06 00:30:00 crc kubenswrapper[4734]: I1206 00:30:00.195441 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd618706-8480-4795-936f-5a5524604fd7" containerName="extract-utilities" Dec 06 00:30:00 crc kubenswrapper[4734]: E1206 00:30:00.195477 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3a3ccd9-2bd4-40d8-817f-d7444a11610e" containerName="extract-utilities" Dec 06 00:30:00 crc kubenswrapper[4734]: I1206 00:30:00.195486 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3a3ccd9-2bd4-40d8-817f-d7444a11610e" containerName="extract-utilities" Dec 06 00:30:00 crc kubenswrapper[4734]: E1206 00:30:00.195498 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd618706-8480-4795-936f-5a5524604fd7" containerName="extract-content" Dec 06 00:30:00 crc kubenswrapper[4734]: I1206 00:30:00.195505 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd618706-8480-4795-936f-5a5524604fd7" containerName="extract-content" Dec 06 00:30:00 crc kubenswrapper[4734]: E1206 00:30:00.195544 4734 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3a3ccd9-2bd4-40d8-817f-d7444a11610e" containerName="extract-content" Dec 06 00:30:00 crc kubenswrapper[4734]: I1206 00:30:00.195553 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3a3ccd9-2bd4-40d8-817f-d7444a11610e" containerName="extract-content" Dec 06 00:30:00 crc kubenswrapper[4734]: E1206 00:30:00.195568 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d96ce40-7782-46c2-816b-6e792c06b2f1" containerName="gather" Dec 06 00:30:00 crc kubenswrapper[4734]: I1206 00:30:00.195575 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d96ce40-7782-46c2-816b-6e792c06b2f1" containerName="gather" Dec 06 00:30:00 crc kubenswrapper[4734]: E1206 00:30:00.195585 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd618706-8480-4795-936f-5a5524604fd7" containerName="registry-server" Dec 06 00:30:00 crc kubenswrapper[4734]: I1206 00:30:00.195592 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd618706-8480-4795-936f-5a5524604fd7" containerName="registry-server" Dec 06 00:30:00 crc kubenswrapper[4734]: I1206 00:30:00.195829 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d96ce40-7782-46c2-816b-6e792c06b2f1" containerName="gather" Dec 06 00:30:00 crc kubenswrapper[4734]: I1206 00:30:00.195851 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3a3ccd9-2bd4-40d8-817f-d7444a11610e" containerName="registry-server" Dec 06 00:30:00 crc kubenswrapper[4734]: I1206 00:30:00.195871 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d96ce40-7782-46c2-816b-6e792c06b2f1" containerName="copy" Dec 06 00:30:00 crc kubenswrapper[4734]: I1206 00:30:00.195880 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd618706-8480-4795-936f-5a5524604fd7" containerName="registry-server" Dec 06 00:30:00 crc kubenswrapper[4734]: I1206 00:30:00.196926 4734 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416350-w9tks" Dec 06 00:30:00 crc kubenswrapper[4734]: I1206 00:30:00.201988 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 06 00:30:00 crc kubenswrapper[4734]: I1206 00:30:00.202206 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 06 00:30:00 crc kubenswrapper[4734]: I1206 00:30:00.225619 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416350-w9tks"] Dec 06 00:30:00 crc kubenswrapper[4734]: I1206 00:30:00.299614 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2xvv\" (UniqueName: \"kubernetes.io/projected/89cc37e6-40bb-412f-bad2-4a980419b0c8-kube-api-access-d2xvv\") pod \"collect-profiles-29416350-w9tks\" (UID: \"89cc37e6-40bb-412f-bad2-4a980419b0c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416350-w9tks" Dec 06 00:30:00 crc kubenswrapper[4734]: I1206 00:30:00.299871 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89cc37e6-40bb-412f-bad2-4a980419b0c8-config-volume\") pod \"collect-profiles-29416350-w9tks\" (UID: \"89cc37e6-40bb-412f-bad2-4a980419b0c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416350-w9tks" Dec 06 00:30:00 crc kubenswrapper[4734]: I1206 00:30:00.299955 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/89cc37e6-40bb-412f-bad2-4a980419b0c8-secret-volume\") pod \"collect-profiles-29416350-w9tks\" (UID: \"89cc37e6-40bb-412f-bad2-4a980419b0c8\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29416350-w9tks" Dec 06 00:30:00 crc kubenswrapper[4734]: I1206 00:30:00.402518 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89cc37e6-40bb-412f-bad2-4a980419b0c8-config-volume\") pod \"collect-profiles-29416350-w9tks\" (UID: \"89cc37e6-40bb-412f-bad2-4a980419b0c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416350-w9tks" Dec 06 00:30:00 crc kubenswrapper[4734]: I1206 00:30:00.402618 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/89cc37e6-40bb-412f-bad2-4a980419b0c8-secret-volume\") pod \"collect-profiles-29416350-w9tks\" (UID: \"89cc37e6-40bb-412f-bad2-4a980419b0c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416350-w9tks" Dec 06 00:30:00 crc kubenswrapper[4734]: I1206 00:30:00.402766 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2xvv\" (UniqueName: \"kubernetes.io/projected/89cc37e6-40bb-412f-bad2-4a980419b0c8-kube-api-access-d2xvv\") pod \"collect-profiles-29416350-w9tks\" (UID: \"89cc37e6-40bb-412f-bad2-4a980419b0c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416350-w9tks" Dec 06 00:30:00 crc kubenswrapper[4734]: I1206 00:30:00.403547 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89cc37e6-40bb-412f-bad2-4a980419b0c8-config-volume\") pod \"collect-profiles-29416350-w9tks\" (UID: \"89cc37e6-40bb-412f-bad2-4a980419b0c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416350-w9tks" Dec 06 00:30:00 crc kubenswrapper[4734]: I1206 00:30:00.410585 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/89cc37e6-40bb-412f-bad2-4a980419b0c8-secret-volume\") pod \"collect-profiles-29416350-w9tks\" (UID: \"89cc37e6-40bb-412f-bad2-4a980419b0c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416350-w9tks" Dec 06 00:30:00 crc kubenswrapper[4734]: I1206 00:30:00.424484 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2xvv\" (UniqueName: \"kubernetes.io/projected/89cc37e6-40bb-412f-bad2-4a980419b0c8-kube-api-access-d2xvv\") pod \"collect-profiles-29416350-w9tks\" (UID: \"89cc37e6-40bb-412f-bad2-4a980419b0c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29416350-w9tks" Dec 06 00:30:00 crc kubenswrapper[4734]: I1206 00:30:00.529904 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416350-w9tks" Dec 06 00:30:01 crc kubenswrapper[4734]: I1206 00:30:01.052421 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416350-w9tks"] Dec 06 00:30:01 crc kubenswrapper[4734]: I1206 00:30:01.605872 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416350-w9tks" event={"ID":"89cc37e6-40bb-412f-bad2-4a980419b0c8","Type":"ContainerStarted","Data":"0aaeb94824af6b9a1a9c4dfa0abbc681fea76e774cdabb35ab5a9388502ff511"} Dec 06 00:30:01 crc kubenswrapper[4734]: I1206 00:30:01.606298 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416350-w9tks" event={"ID":"89cc37e6-40bb-412f-bad2-4a980419b0c8","Type":"ContainerStarted","Data":"a2f5654e61a5158fa5c67a6734b6adfb23729d1c6fb89d080c912d377767ac9a"} Dec 06 00:30:02 crc kubenswrapper[4734]: I1206 00:30:02.617177 4734 generic.go:334] "Generic (PLEG): container finished" podID="89cc37e6-40bb-412f-bad2-4a980419b0c8" 
containerID="0aaeb94824af6b9a1a9c4dfa0abbc681fea76e774cdabb35ab5a9388502ff511" exitCode=0 Dec 06 00:30:02 crc kubenswrapper[4734]: I1206 00:30:02.617283 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416350-w9tks" event={"ID":"89cc37e6-40bb-412f-bad2-4a980419b0c8","Type":"ContainerDied","Data":"0aaeb94824af6b9a1a9c4dfa0abbc681fea76e774cdabb35ab5a9388502ff511"} Dec 06 00:30:03 crc kubenswrapper[4734]: I1206 00:30:03.975578 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416350-w9tks" Dec 06 00:30:04 crc kubenswrapper[4734]: I1206 00:30:04.092974 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2xvv\" (UniqueName: \"kubernetes.io/projected/89cc37e6-40bb-412f-bad2-4a980419b0c8-kube-api-access-d2xvv\") pod \"89cc37e6-40bb-412f-bad2-4a980419b0c8\" (UID: \"89cc37e6-40bb-412f-bad2-4a980419b0c8\") " Dec 06 00:30:04 crc kubenswrapper[4734]: I1206 00:30:04.093050 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/89cc37e6-40bb-412f-bad2-4a980419b0c8-secret-volume\") pod \"89cc37e6-40bb-412f-bad2-4a980419b0c8\" (UID: \"89cc37e6-40bb-412f-bad2-4a980419b0c8\") " Dec 06 00:30:04 crc kubenswrapper[4734]: I1206 00:30:04.093256 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89cc37e6-40bb-412f-bad2-4a980419b0c8-config-volume\") pod \"89cc37e6-40bb-412f-bad2-4a980419b0c8\" (UID: \"89cc37e6-40bb-412f-bad2-4a980419b0c8\") " Dec 06 00:30:04 crc kubenswrapper[4734]: I1206 00:30:04.094450 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89cc37e6-40bb-412f-bad2-4a980419b0c8-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"89cc37e6-40bb-412f-bad2-4a980419b0c8" (UID: "89cc37e6-40bb-412f-bad2-4a980419b0c8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 06 00:30:04 crc kubenswrapper[4734]: I1206 00:30:04.101887 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89cc37e6-40bb-412f-bad2-4a980419b0c8-kube-api-access-d2xvv" (OuterVolumeSpecName: "kube-api-access-d2xvv") pod "89cc37e6-40bb-412f-bad2-4a980419b0c8" (UID: "89cc37e6-40bb-412f-bad2-4a980419b0c8"). InnerVolumeSpecName "kube-api-access-d2xvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:30:04 crc kubenswrapper[4734]: I1206 00:30:04.102315 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89cc37e6-40bb-412f-bad2-4a980419b0c8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "89cc37e6-40bb-412f-bad2-4a980419b0c8" (UID: "89cc37e6-40bb-412f-bad2-4a980419b0c8"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 06 00:30:04 crc kubenswrapper[4734]: I1206 00:30:04.198422 4734 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89cc37e6-40bb-412f-bad2-4a980419b0c8-config-volume\") on node \"crc\" DevicePath \"\"" Dec 06 00:30:04 crc kubenswrapper[4734]: I1206 00:30:04.198466 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2xvv\" (UniqueName: \"kubernetes.io/projected/89cc37e6-40bb-412f-bad2-4a980419b0c8-kube-api-access-d2xvv\") on node \"crc\" DevicePath \"\"" Dec 06 00:30:04 crc kubenswrapper[4734]: I1206 00:30:04.198478 4734 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/89cc37e6-40bb-412f-bad2-4a980419b0c8-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 06 00:30:04 crc kubenswrapper[4734]: I1206 00:30:04.643760 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29416350-w9tks" event={"ID":"89cc37e6-40bb-412f-bad2-4a980419b0c8","Type":"ContainerDied","Data":"a2f5654e61a5158fa5c67a6734b6adfb23729d1c6fb89d080c912d377767ac9a"} Dec 06 00:30:04 crc kubenswrapper[4734]: I1206 00:30:04.643824 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2f5654e61a5158fa5c67a6734b6adfb23729d1c6fb89d080c912d377767ac9a" Dec 06 00:30:04 crc kubenswrapper[4734]: I1206 00:30:04.643827 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29416350-w9tks" Dec 06 00:30:05 crc kubenswrapper[4734]: I1206 00:30:05.066431 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416305-th7b6"] Dec 06 00:30:05 crc kubenswrapper[4734]: I1206 00:30:05.075289 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29416305-th7b6"] Dec 06 00:30:05 crc kubenswrapper[4734]: I1206 00:30:05.625797 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4980d445-086f-4a87-9cfa-b5b4e6196a09" path="/var/lib/kubelet/pods/4980d445-086f-4a87-9cfa-b5b4e6196a09/volumes" Dec 06 00:30:37 crc kubenswrapper[4734]: I1206 00:30:37.564475 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5wnj2/must-gather-27zj4"] Dec 06 00:30:37 crc kubenswrapper[4734]: E1206 00:30:37.569633 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89cc37e6-40bb-412f-bad2-4a980419b0c8" containerName="collect-profiles" Dec 06 00:30:37 crc kubenswrapper[4734]: I1206 00:30:37.569664 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="89cc37e6-40bb-412f-bad2-4a980419b0c8" containerName="collect-profiles" Dec 06 00:30:37 crc kubenswrapper[4734]: I1206 00:30:37.569925 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="89cc37e6-40bb-412f-bad2-4a980419b0c8" containerName="collect-profiles" Dec 06 00:30:37 crc kubenswrapper[4734]: I1206 00:30:37.571408 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5wnj2/must-gather-27zj4" Dec 06 00:30:37 crc kubenswrapper[4734]: I1206 00:30:37.576065 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5wnj2"/"openshift-service-ca.crt" Dec 06 00:30:37 crc kubenswrapper[4734]: I1206 00:30:37.576205 4734 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5wnj2"/"kube-root-ca.crt" Dec 06 00:30:37 crc kubenswrapper[4734]: I1206 00:30:37.604337 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvw5g\" (UniqueName: \"kubernetes.io/projected/7437b2ff-0f14-41c8-a613-f583ed483d0b-kube-api-access-bvw5g\") pod \"must-gather-27zj4\" (UID: \"7437b2ff-0f14-41c8-a613-f583ed483d0b\") " pod="openshift-must-gather-5wnj2/must-gather-27zj4" Dec 06 00:30:37 crc kubenswrapper[4734]: I1206 00:30:37.604475 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7437b2ff-0f14-41c8-a613-f583ed483d0b-must-gather-output\") pod \"must-gather-27zj4\" (UID: \"7437b2ff-0f14-41c8-a613-f583ed483d0b\") " pod="openshift-must-gather-5wnj2/must-gather-27zj4" Dec 06 00:30:37 crc kubenswrapper[4734]: I1206 00:30:37.677764 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5wnj2/must-gather-27zj4"] Dec 06 00:30:37 crc kubenswrapper[4734]: I1206 00:30:37.709508 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvw5g\" (UniqueName: \"kubernetes.io/projected/7437b2ff-0f14-41c8-a613-f583ed483d0b-kube-api-access-bvw5g\") pod \"must-gather-27zj4\" (UID: \"7437b2ff-0f14-41c8-a613-f583ed483d0b\") " pod="openshift-must-gather-5wnj2/must-gather-27zj4" Dec 06 00:30:37 crc kubenswrapper[4734]: I1206 00:30:37.709651 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7437b2ff-0f14-41c8-a613-f583ed483d0b-must-gather-output\") pod \"must-gather-27zj4\" (UID: \"7437b2ff-0f14-41c8-a613-f583ed483d0b\") " pod="openshift-must-gather-5wnj2/must-gather-27zj4" Dec 06 00:30:37 crc kubenswrapper[4734]: I1206 00:30:37.710136 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7437b2ff-0f14-41c8-a613-f583ed483d0b-must-gather-output\") pod \"must-gather-27zj4\" (UID: \"7437b2ff-0f14-41c8-a613-f583ed483d0b\") " pod="openshift-must-gather-5wnj2/must-gather-27zj4" Dec 06 00:30:37 crc kubenswrapper[4734]: I1206 00:30:37.738047 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvw5g\" (UniqueName: \"kubernetes.io/projected/7437b2ff-0f14-41c8-a613-f583ed483d0b-kube-api-access-bvw5g\") pod \"must-gather-27zj4\" (UID: \"7437b2ff-0f14-41c8-a613-f583ed483d0b\") " pod="openshift-must-gather-5wnj2/must-gather-27zj4" Dec 06 00:30:37 crc kubenswrapper[4734]: I1206 00:30:37.909445 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5wnj2/must-gather-27zj4" Dec 06 00:30:38 crc kubenswrapper[4734]: I1206 00:30:38.407239 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5wnj2/must-gather-27zj4"] Dec 06 00:30:39 crc kubenswrapper[4734]: I1206 00:30:39.081734 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5wnj2/must-gather-27zj4" event={"ID":"7437b2ff-0f14-41c8-a613-f583ed483d0b","Type":"ContainerStarted","Data":"898f813609ab9f00ad5549e0f4340a97f480bfdc392fb6dc3739b2e9fa399973"} Dec 06 00:30:40 crc kubenswrapper[4734]: I1206 00:30:40.097565 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5wnj2/must-gather-27zj4" event={"ID":"7437b2ff-0f14-41c8-a613-f583ed483d0b","Type":"ContainerStarted","Data":"1558380f7380e17cc5d139b70624a126d116b0e81ff5b2fcb94ffaa10c496388"} Dec 06 00:30:41 crc kubenswrapper[4734]: I1206 00:30:41.128662 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5wnj2/must-gather-27zj4" event={"ID":"7437b2ff-0f14-41c8-a613-f583ed483d0b","Type":"ContainerStarted","Data":"b640db05c6941c22238bfe8ece15168ed1e93842773a8339239cbeb90a50c2b0"} Dec 06 00:30:41 crc kubenswrapper[4734]: I1206 00:30:41.159160 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5wnj2/must-gather-27zj4" podStartSLOduration=4.1591297130000005 podStartE2EDuration="4.159129713s" podCreationTimestamp="2025-12-06 00:30:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 00:30:41.158315193 +0000 UTC m=+4261.841719469" watchObservedRunningTime="2025-12-06 00:30:41.159129713 +0000 UTC m=+4261.842533989" Dec 06 00:30:44 crc kubenswrapper[4734]: I1206 00:30:44.415206 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5wnj2/crc-debug-p7hvf"] Dec 06 00:30:44 crc 
kubenswrapper[4734]: I1206 00:30:44.417170 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5wnj2/crc-debug-p7hvf" Dec 06 00:30:44 crc kubenswrapper[4734]: I1206 00:30:44.420129 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-5wnj2"/"default-dockercfg-8qrhp" Dec 06 00:30:44 crc kubenswrapper[4734]: I1206 00:30:44.479399 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ecfd21d3-94cf-4ee9-b7de-11c94c9bc686-host\") pod \"crc-debug-p7hvf\" (UID: \"ecfd21d3-94cf-4ee9-b7de-11c94c9bc686\") " pod="openshift-must-gather-5wnj2/crc-debug-p7hvf" Dec 06 00:30:44 crc kubenswrapper[4734]: I1206 00:30:44.479876 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx52h\" (UniqueName: \"kubernetes.io/projected/ecfd21d3-94cf-4ee9-b7de-11c94c9bc686-kube-api-access-gx52h\") pod \"crc-debug-p7hvf\" (UID: \"ecfd21d3-94cf-4ee9-b7de-11c94c9bc686\") " pod="openshift-must-gather-5wnj2/crc-debug-p7hvf" Dec 06 00:30:44 crc kubenswrapper[4734]: I1206 00:30:44.581952 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx52h\" (UniqueName: \"kubernetes.io/projected/ecfd21d3-94cf-4ee9-b7de-11c94c9bc686-kube-api-access-gx52h\") pod \"crc-debug-p7hvf\" (UID: \"ecfd21d3-94cf-4ee9-b7de-11c94c9bc686\") " pod="openshift-must-gather-5wnj2/crc-debug-p7hvf" Dec 06 00:30:44 crc kubenswrapper[4734]: I1206 00:30:44.582077 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ecfd21d3-94cf-4ee9-b7de-11c94c9bc686-host\") pod \"crc-debug-p7hvf\" (UID: \"ecfd21d3-94cf-4ee9-b7de-11c94c9bc686\") " pod="openshift-must-gather-5wnj2/crc-debug-p7hvf" Dec 06 00:30:44 crc kubenswrapper[4734]: I1206 00:30:44.582220 4734 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ecfd21d3-94cf-4ee9-b7de-11c94c9bc686-host\") pod \"crc-debug-p7hvf\" (UID: \"ecfd21d3-94cf-4ee9-b7de-11c94c9bc686\") " pod="openshift-must-gather-5wnj2/crc-debug-p7hvf" Dec 06 00:30:44 crc kubenswrapper[4734]: I1206 00:30:44.605134 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx52h\" (UniqueName: \"kubernetes.io/projected/ecfd21d3-94cf-4ee9-b7de-11c94c9bc686-kube-api-access-gx52h\") pod \"crc-debug-p7hvf\" (UID: \"ecfd21d3-94cf-4ee9-b7de-11c94c9bc686\") " pod="openshift-must-gather-5wnj2/crc-debug-p7hvf" Dec 06 00:30:44 crc kubenswrapper[4734]: I1206 00:30:44.742710 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5wnj2/crc-debug-p7hvf" Dec 06 00:30:45 crc kubenswrapper[4734]: I1206 00:30:45.184024 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5wnj2/crc-debug-p7hvf" event={"ID":"ecfd21d3-94cf-4ee9-b7de-11c94c9bc686","Type":"ContainerStarted","Data":"9fddd84121db8cbae67843a342f39fc1374895f1481946815a993459c87bd8f5"} Dec 06 00:30:45 crc kubenswrapper[4734]: I1206 00:30:45.185275 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5wnj2/crc-debug-p7hvf" event={"ID":"ecfd21d3-94cf-4ee9-b7de-11c94c9bc686","Type":"ContainerStarted","Data":"b9ef8b9b933fb37174df84ecfaa7046ec913bdf89383ef171a780be1c53a2c6d"} Dec 06 00:30:45 crc kubenswrapper[4734]: I1206 00:30:45.218705 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5wnj2/crc-debug-p7hvf" podStartSLOduration=1.218683121 podStartE2EDuration="1.218683121s" podCreationTimestamp="2025-12-06 00:30:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-06 00:30:45.206492702 +0000 UTC m=+4265.889896978" 
watchObservedRunningTime="2025-12-06 00:30:45.218683121 +0000 UTC m=+4265.902087387" Dec 06 00:30:57 crc kubenswrapper[4734]: I1206 00:30:57.557131 4734 scope.go:117] "RemoveContainer" containerID="239f2595428af802099ded42c3311bee633818cade65a8719e9bbbeed3d0f823" Dec 06 00:31:19 crc kubenswrapper[4734]: I1206 00:31:19.553196 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4zzjd"] Dec 06 00:31:19 crc kubenswrapper[4734]: I1206 00:31:19.562854 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4zzjd" Dec 06 00:31:19 crc kubenswrapper[4734]: I1206 00:31:19.593107 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4zzjd"] Dec 06 00:31:19 crc kubenswrapper[4734]: I1206 00:31:19.763454 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3b374c0-9943-4407-a015-6f596bc75fe1-utilities\") pod \"community-operators-4zzjd\" (UID: \"b3b374c0-9943-4407-a015-6f596bc75fe1\") " pod="openshift-marketplace/community-operators-4zzjd" Dec 06 00:31:19 crc kubenswrapper[4734]: I1206 00:31:19.763978 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txtvq\" (UniqueName: \"kubernetes.io/projected/b3b374c0-9943-4407-a015-6f596bc75fe1-kube-api-access-txtvq\") pod \"community-operators-4zzjd\" (UID: \"b3b374c0-9943-4407-a015-6f596bc75fe1\") " pod="openshift-marketplace/community-operators-4zzjd" Dec 06 00:31:19 crc kubenswrapper[4734]: I1206 00:31:19.764182 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3b374c0-9943-4407-a015-6f596bc75fe1-catalog-content\") pod \"community-operators-4zzjd\" (UID: \"b3b374c0-9943-4407-a015-6f596bc75fe1\") " 
pod="openshift-marketplace/community-operators-4zzjd" Dec 06 00:31:19 crc kubenswrapper[4734]: I1206 00:31:19.866154 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3b374c0-9943-4407-a015-6f596bc75fe1-catalog-content\") pod \"community-operators-4zzjd\" (UID: \"b3b374c0-9943-4407-a015-6f596bc75fe1\") " pod="openshift-marketplace/community-operators-4zzjd" Dec 06 00:31:19 crc kubenswrapper[4734]: I1206 00:31:19.866672 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3b374c0-9943-4407-a015-6f596bc75fe1-utilities\") pod \"community-operators-4zzjd\" (UID: \"b3b374c0-9943-4407-a015-6f596bc75fe1\") " pod="openshift-marketplace/community-operators-4zzjd" Dec 06 00:31:19 crc kubenswrapper[4734]: I1206 00:31:19.866840 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txtvq\" (UniqueName: \"kubernetes.io/projected/b3b374c0-9943-4407-a015-6f596bc75fe1-kube-api-access-txtvq\") pod \"community-operators-4zzjd\" (UID: \"b3b374c0-9943-4407-a015-6f596bc75fe1\") " pod="openshift-marketplace/community-operators-4zzjd" Dec 06 00:31:19 crc kubenswrapper[4734]: I1206 00:31:19.867791 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3b374c0-9943-4407-a015-6f596bc75fe1-catalog-content\") pod \"community-operators-4zzjd\" (UID: \"b3b374c0-9943-4407-a015-6f596bc75fe1\") " pod="openshift-marketplace/community-operators-4zzjd" Dec 06 00:31:19 crc kubenswrapper[4734]: I1206 00:31:19.868096 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3b374c0-9943-4407-a015-6f596bc75fe1-utilities\") pod \"community-operators-4zzjd\" (UID: \"b3b374c0-9943-4407-a015-6f596bc75fe1\") " 
pod="openshift-marketplace/community-operators-4zzjd" Dec 06 00:31:19 crc kubenswrapper[4734]: I1206 00:31:19.890488 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txtvq\" (UniqueName: \"kubernetes.io/projected/b3b374c0-9943-4407-a015-6f596bc75fe1-kube-api-access-txtvq\") pod \"community-operators-4zzjd\" (UID: \"b3b374c0-9943-4407-a015-6f596bc75fe1\") " pod="openshift-marketplace/community-operators-4zzjd" Dec 06 00:31:19 crc kubenswrapper[4734]: I1206 00:31:19.906284 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4zzjd" Dec 06 00:31:20 crc kubenswrapper[4734]: I1206 00:31:20.508928 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4zzjd"] Dec 06 00:31:20 crc kubenswrapper[4734]: I1206 00:31:20.549580 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4zzjd" event={"ID":"b3b374c0-9943-4407-a015-6f596bc75fe1","Type":"ContainerStarted","Data":"9761f5cdfd6a9dd9b067e2b0775824c98640c92ee7156d4cb4b150192b2f463c"} Dec 06 00:31:21 crc kubenswrapper[4734]: I1206 00:31:21.562711 4734 generic.go:334] "Generic (PLEG): container finished" podID="b3b374c0-9943-4407-a015-6f596bc75fe1" containerID="9c749bf339abd4e99b9cccb7633ae8012dcae20e4de196bc37ef1638343c2b96" exitCode=0 Dec 06 00:31:21 crc kubenswrapper[4734]: I1206 00:31:21.562778 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4zzjd" event={"ID":"b3b374c0-9943-4407-a015-6f596bc75fe1","Type":"ContainerDied","Data":"9c749bf339abd4e99b9cccb7633ae8012dcae20e4de196bc37ef1638343c2b96"} Dec 06 00:31:22 crc kubenswrapper[4734]: I1206 00:31:22.575088 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4zzjd" 
event={"ID":"b3b374c0-9943-4407-a015-6f596bc75fe1","Type":"ContainerStarted","Data":"4b2b9ac49ca5e93bbdc06d62e99152072e44438e7b38114ee8d0165fc19751e1"} Dec 06 00:31:23 crc kubenswrapper[4734]: I1206 00:31:23.587099 4734 generic.go:334] "Generic (PLEG): container finished" podID="b3b374c0-9943-4407-a015-6f596bc75fe1" containerID="4b2b9ac49ca5e93bbdc06d62e99152072e44438e7b38114ee8d0165fc19751e1" exitCode=0 Dec 06 00:31:23 crc kubenswrapper[4734]: I1206 00:31:23.587197 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4zzjd" event={"ID":"b3b374c0-9943-4407-a015-6f596bc75fe1","Type":"ContainerDied","Data":"4b2b9ac49ca5e93bbdc06d62e99152072e44438e7b38114ee8d0165fc19751e1"} Dec 06 00:31:24 crc kubenswrapper[4734]: I1206 00:31:24.599215 4734 generic.go:334] "Generic (PLEG): container finished" podID="ecfd21d3-94cf-4ee9-b7de-11c94c9bc686" containerID="9fddd84121db8cbae67843a342f39fc1374895f1481946815a993459c87bd8f5" exitCode=0 Dec 06 00:31:24 crc kubenswrapper[4734]: I1206 00:31:24.599315 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5wnj2/crc-debug-p7hvf" event={"ID":"ecfd21d3-94cf-4ee9-b7de-11c94c9bc686","Type":"ContainerDied","Data":"9fddd84121db8cbae67843a342f39fc1374895f1481946815a993459c87bd8f5"} Dec 06 00:31:24 crc kubenswrapper[4734]: I1206 00:31:24.604268 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4zzjd" event={"ID":"b3b374c0-9943-4407-a015-6f596bc75fe1","Type":"ContainerStarted","Data":"91e98863e38b023aac3047176d55b0fd49e557ebffd1a88b2264ac9fa46aa04f"} Dec 06 00:31:24 crc kubenswrapper[4734]: I1206 00:31:24.663059 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4zzjd" podStartSLOduration=3.220328591 podStartE2EDuration="5.663032503s" podCreationTimestamp="2025-12-06 00:31:19 +0000 UTC" firstStartedPulling="2025-12-06 00:31:21.565493417 +0000 UTC 
m=+4302.248897693" lastFinishedPulling="2025-12-06 00:31:24.008197329 +0000 UTC m=+4304.691601605" observedRunningTime="2025-12-06 00:31:24.661739991 +0000 UTC m=+4305.345144267" watchObservedRunningTime="2025-12-06 00:31:24.663032503 +0000 UTC m=+4305.346436779" Dec 06 00:31:25 crc kubenswrapper[4734]: I1206 00:31:25.740800 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5wnj2/crc-debug-p7hvf" Dec 06 00:31:25 crc kubenswrapper[4734]: I1206 00:31:25.782275 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5wnj2/crc-debug-p7hvf"] Dec 06 00:31:25 crc kubenswrapper[4734]: I1206 00:31:25.806444 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5wnj2/crc-debug-p7hvf"] Dec 06 00:31:25 crc kubenswrapper[4734]: I1206 00:31:25.813156 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ecfd21d3-94cf-4ee9-b7de-11c94c9bc686-host\") pod \"ecfd21d3-94cf-4ee9-b7de-11c94c9bc686\" (UID: \"ecfd21d3-94cf-4ee9-b7de-11c94c9bc686\") " Dec 06 00:31:25 crc kubenswrapper[4734]: I1206 00:31:25.813233 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gx52h\" (UniqueName: \"kubernetes.io/projected/ecfd21d3-94cf-4ee9-b7de-11c94c9bc686-kube-api-access-gx52h\") pod \"ecfd21d3-94cf-4ee9-b7de-11c94c9bc686\" (UID: \"ecfd21d3-94cf-4ee9-b7de-11c94c9bc686\") " Dec 06 00:31:25 crc kubenswrapper[4734]: I1206 00:31:25.813279 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ecfd21d3-94cf-4ee9-b7de-11c94c9bc686-host" (OuterVolumeSpecName: "host") pod "ecfd21d3-94cf-4ee9-b7de-11c94c9bc686" (UID: "ecfd21d3-94cf-4ee9-b7de-11c94c9bc686"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 00:31:25 crc kubenswrapper[4734]: I1206 00:31:25.813828 4734 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ecfd21d3-94cf-4ee9-b7de-11c94c9bc686-host\") on node \"crc\" DevicePath \"\"" Dec 06 00:31:25 crc kubenswrapper[4734]: I1206 00:31:25.827986 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecfd21d3-94cf-4ee9-b7de-11c94c9bc686-kube-api-access-gx52h" (OuterVolumeSpecName: "kube-api-access-gx52h") pod "ecfd21d3-94cf-4ee9-b7de-11c94c9bc686" (UID: "ecfd21d3-94cf-4ee9-b7de-11c94c9bc686"). InnerVolumeSpecName "kube-api-access-gx52h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:31:25 crc kubenswrapper[4734]: I1206 00:31:25.916433 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gx52h\" (UniqueName: \"kubernetes.io/projected/ecfd21d3-94cf-4ee9-b7de-11c94c9bc686-kube-api-access-gx52h\") on node \"crc\" DevicePath \"\"" Dec 06 00:31:26 crc kubenswrapper[4734]: I1206 00:31:26.639909 4734 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9ef8b9b933fb37174df84ecfaa7046ec913bdf89383ef171a780be1c53a2c6d" Dec 06 00:31:26 crc kubenswrapper[4734]: I1206 00:31:26.640044 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5wnj2/crc-debug-p7hvf" Dec 06 00:31:27 crc kubenswrapper[4734]: I1206 00:31:27.053566 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5wnj2/crc-debug-4fw4n"] Dec 06 00:31:27 crc kubenswrapper[4734]: E1206 00:31:27.054561 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecfd21d3-94cf-4ee9-b7de-11c94c9bc686" containerName="container-00" Dec 06 00:31:27 crc kubenswrapper[4734]: I1206 00:31:27.054582 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecfd21d3-94cf-4ee9-b7de-11c94c9bc686" containerName="container-00" Dec 06 00:31:27 crc kubenswrapper[4734]: I1206 00:31:27.054905 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecfd21d3-94cf-4ee9-b7de-11c94c9bc686" containerName="container-00" Dec 06 00:31:27 crc kubenswrapper[4734]: I1206 00:31:27.055810 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5wnj2/crc-debug-4fw4n" Dec 06 00:31:27 crc kubenswrapper[4734]: I1206 00:31:27.058254 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-5wnj2"/"default-dockercfg-8qrhp" Dec 06 00:31:27 crc kubenswrapper[4734]: I1206 00:31:27.152247 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/afeca3e5-be11-4cfa-bcbd-4f59eb39aa76-host\") pod \"crc-debug-4fw4n\" (UID: \"afeca3e5-be11-4cfa-bcbd-4f59eb39aa76\") " pod="openshift-must-gather-5wnj2/crc-debug-4fw4n" Dec 06 00:31:27 crc kubenswrapper[4734]: I1206 00:31:27.152711 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgpwr\" (UniqueName: \"kubernetes.io/projected/afeca3e5-be11-4cfa-bcbd-4f59eb39aa76-kube-api-access-rgpwr\") pod \"crc-debug-4fw4n\" (UID: \"afeca3e5-be11-4cfa-bcbd-4f59eb39aa76\") " 
pod="openshift-must-gather-5wnj2/crc-debug-4fw4n" Dec 06 00:31:27 crc kubenswrapper[4734]: I1206 00:31:27.254746 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgpwr\" (UniqueName: \"kubernetes.io/projected/afeca3e5-be11-4cfa-bcbd-4f59eb39aa76-kube-api-access-rgpwr\") pod \"crc-debug-4fw4n\" (UID: \"afeca3e5-be11-4cfa-bcbd-4f59eb39aa76\") " pod="openshift-must-gather-5wnj2/crc-debug-4fw4n" Dec 06 00:31:27 crc kubenswrapper[4734]: I1206 00:31:27.254916 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/afeca3e5-be11-4cfa-bcbd-4f59eb39aa76-host\") pod \"crc-debug-4fw4n\" (UID: \"afeca3e5-be11-4cfa-bcbd-4f59eb39aa76\") " pod="openshift-must-gather-5wnj2/crc-debug-4fw4n" Dec 06 00:31:27 crc kubenswrapper[4734]: I1206 00:31:27.255069 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/afeca3e5-be11-4cfa-bcbd-4f59eb39aa76-host\") pod \"crc-debug-4fw4n\" (UID: \"afeca3e5-be11-4cfa-bcbd-4f59eb39aa76\") " pod="openshift-must-gather-5wnj2/crc-debug-4fw4n" Dec 06 00:31:27 crc kubenswrapper[4734]: I1206 00:31:27.275673 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgpwr\" (UniqueName: \"kubernetes.io/projected/afeca3e5-be11-4cfa-bcbd-4f59eb39aa76-kube-api-access-rgpwr\") pod \"crc-debug-4fw4n\" (UID: \"afeca3e5-be11-4cfa-bcbd-4f59eb39aa76\") " pod="openshift-must-gather-5wnj2/crc-debug-4fw4n" Dec 06 00:31:27 crc kubenswrapper[4734]: I1206 00:31:27.379483 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5wnj2/crc-debug-4fw4n" Dec 06 00:31:27 crc kubenswrapper[4734]: W1206 00:31:27.416491 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafeca3e5_be11_4cfa_bcbd_4f59eb39aa76.slice/crio-33193ba95f73c46b7e8da20b20520dc19e3c1e09510bf7db10e4dce42944baf2 WatchSource:0}: Error finding container 33193ba95f73c46b7e8da20b20520dc19e3c1e09510bf7db10e4dce42944baf2: Status 404 returned error can't find the container with id 33193ba95f73c46b7e8da20b20520dc19e3c1e09510bf7db10e4dce42944baf2 Dec 06 00:31:27 crc kubenswrapper[4734]: I1206 00:31:27.629623 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecfd21d3-94cf-4ee9-b7de-11c94c9bc686" path="/var/lib/kubelet/pods/ecfd21d3-94cf-4ee9-b7de-11c94c9bc686/volumes" Dec 06 00:31:27 crc kubenswrapper[4734]: I1206 00:31:27.650662 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5wnj2/crc-debug-4fw4n" event={"ID":"afeca3e5-be11-4cfa-bcbd-4f59eb39aa76","Type":"ContainerStarted","Data":"33193ba95f73c46b7e8da20b20520dc19e3c1e09510bf7db10e4dce42944baf2"} Dec 06 00:31:28 crc kubenswrapper[4734]: I1206 00:31:28.675498 4734 generic.go:334] "Generic (PLEG): container finished" podID="afeca3e5-be11-4cfa-bcbd-4f59eb39aa76" containerID="a4da6144cae1b42bf300b0dc40967951b0c3e13fddef3b167a466db097237b77" exitCode=0 Dec 06 00:31:28 crc kubenswrapper[4734]: I1206 00:31:28.675703 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5wnj2/crc-debug-4fw4n" event={"ID":"afeca3e5-be11-4cfa-bcbd-4f59eb39aa76","Type":"ContainerDied","Data":"a4da6144cae1b42bf300b0dc40967951b0c3e13fddef3b167a466db097237b77"} Dec 06 00:31:29 crc kubenswrapper[4734]: I1206 00:31:29.149649 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5wnj2/crc-debug-4fw4n"] Dec 06 00:31:29 crc kubenswrapper[4734]: I1206 00:31:29.159743 4734 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5wnj2/crc-debug-4fw4n"] Dec 06 00:31:29 crc kubenswrapper[4734]: I1206 00:31:29.792418 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5wnj2/crc-debug-4fw4n" Dec 06 00:31:29 crc kubenswrapper[4734]: I1206 00:31:29.907357 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4zzjd" Dec 06 00:31:29 crc kubenswrapper[4734]: I1206 00:31:29.907437 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4zzjd" Dec 06 00:31:29 crc kubenswrapper[4734]: I1206 00:31:29.915054 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgpwr\" (UniqueName: \"kubernetes.io/projected/afeca3e5-be11-4cfa-bcbd-4f59eb39aa76-kube-api-access-rgpwr\") pod \"afeca3e5-be11-4cfa-bcbd-4f59eb39aa76\" (UID: \"afeca3e5-be11-4cfa-bcbd-4f59eb39aa76\") " Dec 06 00:31:29 crc kubenswrapper[4734]: I1206 00:31:29.915136 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/afeca3e5-be11-4cfa-bcbd-4f59eb39aa76-host\") pod \"afeca3e5-be11-4cfa-bcbd-4f59eb39aa76\" (UID: \"afeca3e5-be11-4cfa-bcbd-4f59eb39aa76\") " Dec 06 00:31:29 crc kubenswrapper[4734]: I1206 00:31:29.915866 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/afeca3e5-be11-4cfa-bcbd-4f59eb39aa76-host" (OuterVolumeSpecName: "host") pod "afeca3e5-be11-4cfa-bcbd-4f59eb39aa76" (UID: "afeca3e5-be11-4cfa-bcbd-4f59eb39aa76"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 00:31:29 crc kubenswrapper[4734]: I1206 00:31:29.917081 4734 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/afeca3e5-be11-4cfa-bcbd-4f59eb39aa76-host\") on node \"crc\" DevicePath \"\"" Dec 06 00:31:29 crc kubenswrapper[4734]: I1206 00:31:29.922045 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afeca3e5-be11-4cfa-bcbd-4f59eb39aa76-kube-api-access-rgpwr" (OuterVolumeSpecName: "kube-api-access-rgpwr") pod "afeca3e5-be11-4cfa-bcbd-4f59eb39aa76" (UID: "afeca3e5-be11-4cfa-bcbd-4f59eb39aa76"). InnerVolumeSpecName "kube-api-access-rgpwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:31:29 crc kubenswrapper[4734]: I1206 00:31:29.967713 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4zzjd" Dec 06 00:31:30 crc kubenswrapper[4734]: I1206 00:31:30.020889 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgpwr\" (UniqueName: \"kubernetes.io/projected/afeca3e5-be11-4cfa-bcbd-4f59eb39aa76-kube-api-access-rgpwr\") on node \"crc\" DevicePath \"\"" Dec 06 00:31:30 crc kubenswrapper[4734]: I1206 00:31:30.698669 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5wnj2/crc-debug-4fw4n" Dec 06 00:31:30 crc kubenswrapper[4734]: I1206 00:31:30.698710 4734 scope.go:117] "RemoveContainer" containerID="a4da6144cae1b42bf300b0dc40967951b0c3e13fddef3b167a466db097237b77" Dec 06 00:31:30 crc kubenswrapper[4734]: I1206 00:31:30.775507 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4zzjd" Dec 06 00:31:30 crc kubenswrapper[4734]: I1206 00:31:30.852259 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4zzjd"] Dec 06 00:31:30 crc kubenswrapper[4734]: I1206 00:31:30.896657 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5wnj2/crc-debug-nzccb"] Dec 06 00:31:30 crc kubenswrapper[4734]: E1206 00:31:30.897181 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afeca3e5-be11-4cfa-bcbd-4f59eb39aa76" containerName="container-00" Dec 06 00:31:30 crc kubenswrapper[4734]: I1206 00:31:30.897203 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="afeca3e5-be11-4cfa-bcbd-4f59eb39aa76" containerName="container-00" Dec 06 00:31:30 crc kubenswrapper[4734]: I1206 00:31:30.897426 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="afeca3e5-be11-4cfa-bcbd-4f59eb39aa76" containerName="container-00" Dec 06 00:31:30 crc kubenswrapper[4734]: I1206 00:31:30.898150 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5wnj2/crc-debug-nzccb" Dec 06 00:31:30 crc kubenswrapper[4734]: I1206 00:31:30.908712 4734 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-5wnj2"/"default-dockercfg-8qrhp" Dec 06 00:31:31 crc kubenswrapper[4734]: I1206 00:31:31.044584 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/af200ff3-145c-4b87-9184-9bcad46ae2f0-host\") pod \"crc-debug-nzccb\" (UID: \"af200ff3-145c-4b87-9184-9bcad46ae2f0\") " pod="openshift-must-gather-5wnj2/crc-debug-nzccb" Dec 06 00:31:31 crc kubenswrapper[4734]: I1206 00:31:31.047043 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldgtp\" (UniqueName: \"kubernetes.io/projected/af200ff3-145c-4b87-9184-9bcad46ae2f0-kube-api-access-ldgtp\") pod \"crc-debug-nzccb\" (UID: \"af200ff3-145c-4b87-9184-9bcad46ae2f0\") " pod="openshift-must-gather-5wnj2/crc-debug-nzccb" Dec 06 00:31:31 crc kubenswrapper[4734]: I1206 00:31:31.149399 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldgtp\" (UniqueName: \"kubernetes.io/projected/af200ff3-145c-4b87-9184-9bcad46ae2f0-kube-api-access-ldgtp\") pod \"crc-debug-nzccb\" (UID: \"af200ff3-145c-4b87-9184-9bcad46ae2f0\") " pod="openshift-must-gather-5wnj2/crc-debug-nzccb" Dec 06 00:31:31 crc kubenswrapper[4734]: I1206 00:31:31.152784 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/af200ff3-145c-4b87-9184-9bcad46ae2f0-host\") pod \"crc-debug-nzccb\" (UID: \"af200ff3-145c-4b87-9184-9bcad46ae2f0\") " pod="openshift-must-gather-5wnj2/crc-debug-nzccb" Dec 06 00:31:31 crc kubenswrapper[4734]: I1206 00:31:31.153079 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/af200ff3-145c-4b87-9184-9bcad46ae2f0-host\") pod \"crc-debug-nzccb\" (UID: \"af200ff3-145c-4b87-9184-9bcad46ae2f0\") " pod="openshift-must-gather-5wnj2/crc-debug-nzccb" Dec 06 00:31:31 crc kubenswrapper[4734]: I1206 00:31:31.171872 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldgtp\" (UniqueName: \"kubernetes.io/projected/af200ff3-145c-4b87-9184-9bcad46ae2f0-kube-api-access-ldgtp\") pod \"crc-debug-nzccb\" (UID: \"af200ff3-145c-4b87-9184-9bcad46ae2f0\") " pod="openshift-must-gather-5wnj2/crc-debug-nzccb" Dec 06 00:31:31 crc kubenswrapper[4734]: I1206 00:31:31.235545 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5wnj2/crc-debug-nzccb" Dec 06 00:31:31 crc kubenswrapper[4734]: W1206 00:31:31.278451 4734 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf200ff3_145c_4b87_9184_9bcad46ae2f0.slice/crio-e27dbb12b402a257df05fef380a2fe1434567348b107d7d271d931a58868262e WatchSource:0}: Error finding container e27dbb12b402a257df05fef380a2fe1434567348b107d7d271d931a58868262e: Status 404 returned error can't find the container with id e27dbb12b402a257df05fef380a2fe1434567348b107d7d271d931a58868262e Dec 06 00:31:31 crc kubenswrapper[4734]: I1206 00:31:31.647279 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afeca3e5-be11-4cfa-bcbd-4f59eb39aa76" path="/var/lib/kubelet/pods/afeca3e5-be11-4cfa-bcbd-4f59eb39aa76/volumes" Dec 06 00:31:31 crc kubenswrapper[4734]: I1206 00:31:31.712321 4734 generic.go:334] "Generic (PLEG): container finished" podID="af200ff3-145c-4b87-9184-9bcad46ae2f0" containerID="a03a4c803592a78e9078c4307550cedc31c09ad7a85fb0ac51d97766f5eada04" exitCode=0 Dec 06 00:31:31 crc kubenswrapper[4734]: I1206 00:31:31.712412 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5wnj2/crc-debug-nzccb" 
event={"ID":"af200ff3-145c-4b87-9184-9bcad46ae2f0","Type":"ContainerDied","Data":"a03a4c803592a78e9078c4307550cedc31c09ad7a85fb0ac51d97766f5eada04"} Dec 06 00:31:31 crc kubenswrapper[4734]: I1206 00:31:31.712459 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5wnj2/crc-debug-nzccb" event={"ID":"af200ff3-145c-4b87-9184-9bcad46ae2f0","Type":"ContainerStarted","Data":"e27dbb12b402a257df05fef380a2fe1434567348b107d7d271d931a58868262e"} Dec 06 00:31:31 crc kubenswrapper[4734]: I1206 00:31:31.757176 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5wnj2/crc-debug-nzccb"] Dec 06 00:31:31 crc kubenswrapper[4734]: I1206 00:31:31.766635 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5wnj2/crc-debug-nzccb"] Dec 06 00:31:32 crc kubenswrapper[4734]: I1206 00:31:32.725480 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4zzjd" podUID="b3b374c0-9943-4407-a015-6f596bc75fe1" containerName="registry-server" containerID="cri-o://91e98863e38b023aac3047176d55b0fd49e557ebffd1a88b2264ac9fa46aa04f" gracePeriod=2 Dec 06 00:31:32 crc kubenswrapper[4734]: I1206 00:31:32.942936 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5wnj2/crc-debug-nzccb" Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.101178 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldgtp\" (UniqueName: \"kubernetes.io/projected/af200ff3-145c-4b87-9184-9bcad46ae2f0-kube-api-access-ldgtp\") pod \"af200ff3-145c-4b87-9184-9bcad46ae2f0\" (UID: \"af200ff3-145c-4b87-9184-9bcad46ae2f0\") " Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.102595 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/af200ff3-145c-4b87-9184-9bcad46ae2f0-host\") pod \"af200ff3-145c-4b87-9184-9bcad46ae2f0\" (UID: \"af200ff3-145c-4b87-9184-9bcad46ae2f0\") " Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.102782 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af200ff3-145c-4b87-9184-9bcad46ae2f0-host" (OuterVolumeSpecName: "host") pod "af200ff3-145c-4b87-9184-9bcad46ae2f0" (UID: "af200ff3-145c-4b87-9184-9bcad46ae2f0"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.103250 4734 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/af200ff3-145c-4b87-9184-9bcad46ae2f0-host\") on node \"crc\" DevicePath \"\"" Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.109719 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af200ff3-145c-4b87-9184-9bcad46ae2f0-kube-api-access-ldgtp" (OuterVolumeSpecName: "kube-api-access-ldgtp") pod "af200ff3-145c-4b87-9184-9bcad46ae2f0" (UID: "af200ff3-145c-4b87-9184-9bcad46ae2f0"). InnerVolumeSpecName "kube-api-access-ldgtp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.197900 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4zzjd" Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.203698 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3b374c0-9943-4407-a015-6f596bc75fe1-catalog-content\") pod \"b3b374c0-9943-4407-a015-6f596bc75fe1\" (UID: \"b3b374c0-9943-4407-a015-6f596bc75fe1\") " Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.203754 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txtvq\" (UniqueName: \"kubernetes.io/projected/b3b374c0-9943-4407-a015-6f596bc75fe1-kube-api-access-txtvq\") pod \"b3b374c0-9943-4407-a015-6f596bc75fe1\" (UID: \"b3b374c0-9943-4407-a015-6f596bc75fe1\") " Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.203932 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3b374c0-9943-4407-a015-6f596bc75fe1-utilities\") pod \"b3b374c0-9943-4407-a015-6f596bc75fe1\" (UID: \"b3b374c0-9943-4407-a015-6f596bc75fe1\") " Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.204711 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3b374c0-9943-4407-a015-6f596bc75fe1-utilities" (OuterVolumeSpecName: "utilities") pod "b3b374c0-9943-4407-a015-6f596bc75fe1" (UID: "b3b374c0-9943-4407-a015-6f596bc75fe1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.204997 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3b374c0-9943-4407-a015-6f596bc75fe1-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.205010 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldgtp\" (UniqueName: \"kubernetes.io/projected/af200ff3-145c-4b87-9184-9bcad46ae2f0-kube-api-access-ldgtp\") on node \"crc\" DevicePath \"\"" Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.207663 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3b374c0-9943-4407-a015-6f596bc75fe1-kube-api-access-txtvq" (OuterVolumeSpecName: "kube-api-access-txtvq") pod "b3b374c0-9943-4407-a015-6f596bc75fe1" (UID: "b3b374c0-9943-4407-a015-6f596bc75fe1"). InnerVolumeSpecName "kube-api-access-txtvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.264426 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3b374c0-9943-4407-a015-6f596bc75fe1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3b374c0-9943-4407-a015-6f596bc75fe1" (UID: "b3b374c0-9943-4407-a015-6f596bc75fe1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.305971 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txtvq\" (UniqueName: \"kubernetes.io/projected/b3b374c0-9943-4407-a015-6f596bc75fe1-kube-api-access-txtvq\") on node \"crc\" DevicePath \"\"" Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.306026 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3b374c0-9943-4407-a015-6f596bc75fe1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.644944 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af200ff3-145c-4b87-9184-9bcad46ae2f0" path="/var/lib/kubelet/pods/af200ff3-145c-4b87-9184-9bcad46ae2f0/volumes" Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.740334 4734 generic.go:334] "Generic (PLEG): container finished" podID="b3b374c0-9943-4407-a015-6f596bc75fe1" containerID="91e98863e38b023aac3047176d55b0fd49e557ebffd1a88b2264ac9fa46aa04f" exitCode=0 Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.740490 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4zzjd" Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.740480 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4zzjd" event={"ID":"b3b374c0-9943-4407-a015-6f596bc75fe1","Type":"ContainerDied","Data":"91e98863e38b023aac3047176d55b0fd49e557ebffd1a88b2264ac9fa46aa04f"} Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.740585 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4zzjd" event={"ID":"b3b374c0-9943-4407-a015-6f596bc75fe1","Type":"ContainerDied","Data":"9761f5cdfd6a9dd9b067e2b0775824c98640c92ee7156d4cb4b150192b2f463c"} Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.740616 4734 scope.go:117] "RemoveContainer" containerID="91e98863e38b023aac3047176d55b0fd49e557ebffd1a88b2264ac9fa46aa04f" Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.754740 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5wnj2/crc-debug-nzccb" Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.792113 4734 scope.go:117] "RemoveContainer" containerID="4b2b9ac49ca5e93bbdc06d62e99152072e44438e7b38114ee8d0165fc19751e1" Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.827915 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4zzjd"] Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.834557 4734 scope.go:117] "RemoveContainer" containerID="9c749bf339abd4e99b9cccb7633ae8012dcae20e4de196bc37ef1638343c2b96" Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.843208 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4zzjd"] Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.882949 4734 scope.go:117] "RemoveContainer" containerID="91e98863e38b023aac3047176d55b0fd49e557ebffd1a88b2264ac9fa46aa04f" Dec 06 00:31:33 crc kubenswrapper[4734]: E1206 00:31:33.888932 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91e98863e38b023aac3047176d55b0fd49e557ebffd1a88b2264ac9fa46aa04f\": container with ID starting with 91e98863e38b023aac3047176d55b0fd49e557ebffd1a88b2264ac9fa46aa04f not found: ID does not exist" containerID="91e98863e38b023aac3047176d55b0fd49e557ebffd1a88b2264ac9fa46aa04f" Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.888979 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91e98863e38b023aac3047176d55b0fd49e557ebffd1a88b2264ac9fa46aa04f"} err="failed to get container status \"91e98863e38b023aac3047176d55b0fd49e557ebffd1a88b2264ac9fa46aa04f\": rpc error: code = NotFound desc = could not find container \"91e98863e38b023aac3047176d55b0fd49e557ebffd1a88b2264ac9fa46aa04f\": container with ID starting with 91e98863e38b023aac3047176d55b0fd49e557ebffd1a88b2264ac9fa46aa04f not found: ID 
does not exist" Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.889013 4734 scope.go:117] "RemoveContainer" containerID="4b2b9ac49ca5e93bbdc06d62e99152072e44438e7b38114ee8d0165fc19751e1" Dec 06 00:31:33 crc kubenswrapper[4734]: E1206 00:31:33.889471 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b2b9ac49ca5e93bbdc06d62e99152072e44438e7b38114ee8d0165fc19751e1\": container with ID starting with 4b2b9ac49ca5e93bbdc06d62e99152072e44438e7b38114ee8d0165fc19751e1 not found: ID does not exist" containerID="4b2b9ac49ca5e93bbdc06d62e99152072e44438e7b38114ee8d0165fc19751e1" Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.889570 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b2b9ac49ca5e93bbdc06d62e99152072e44438e7b38114ee8d0165fc19751e1"} err="failed to get container status \"4b2b9ac49ca5e93bbdc06d62e99152072e44438e7b38114ee8d0165fc19751e1\": rpc error: code = NotFound desc = could not find container \"4b2b9ac49ca5e93bbdc06d62e99152072e44438e7b38114ee8d0165fc19751e1\": container with ID starting with 4b2b9ac49ca5e93bbdc06d62e99152072e44438e7b38114ee8d0165fc19751e1 not found: ID does not exist" Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.889592 4734 scope.go:117] "RemoveContainer" containerID="9c749bf339abd4e99b9cccb7633ae8012dcae20e4de196bc37ef1638343c2b96" Dec 06 00:31:33 crc kubenswrapper[4734]: E1206 00:31:33.889884 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c749bf339abd4e99b9cccb7633ae8012dcae20e4de196bc37ef1638343c2b96\": container with ID starting with 9c749bf339abd4e99b9cccb7633ae8012dcae20e4de196bc37ef1638343c2b96 not found: ID does not exist" containerID="9c749bf339abd4e99b9cccb7633ae8012dcae20e4de196bc37ef1638343c2b96" Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.889917 4734 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c749bf339abd4e99b9cccb7633ae8012dcae20e4de196bc37ef1638343c2b96"} err="failed to get container status \"9c749bf339abd4e99b9cccb7633ae8012dcae20e4de196bc37ef1638343c2b96\": rpc error: code = NotFound desc = could not find container \"9c749bf339abd4e99b9cccb7633ae8012dcae20e4de196bc37ef1638343c2b96\": container with ID starting with 9c749bf339abd4e99b9cccb7633ae8012dcae20e4de196bc37ef1638343c2b96 not found: ID does not exist" Dec 06 00:31:33 crc kubenswrapper[4734]: I1206 00:31:33.889935 4734 scope.go:117] "RemoveContainer" containerID="a03a4c803592a78e9078c4307550cedc31c09ad7a85fb0ac51d97766f5eada04" Dec 06 00:31:35 crc kubenswrapper[4734]: I1206 00:31:35.626053 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3b374c0-9943-4407-a015-6f596bc75fe1" path="/var/lib/kubelet/pods/b3b374c0-9943-4407-a015-6f596bc75fe1/volumes" Dec 06 00:31:50 crc kubenswrapper[4734]: I1206 00:31:50.444624 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 00:31:50 crc kubenswrapper[4734]: I1206 00:31:50.445486 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 00:32:00 crc kubenswrapper[4734]: I1206 00:32:00.784475 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5574c9fdf8-q682b_d3c7aa3a-ca07-4476-8b39-06479afae42d/barbican-api/0.log" Dec 06 00:32:00 crc kubenswrapper[4734]: I1206 00:32:00.963144 4734 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_barbican-api-5574c9fdf8-q682b_d3c7aa3a-ca07-4476-8b39-06479afae42d/barbican-api-log/0.log" Dec 06 00:32:01 crc kubenswrapper[4734]: I1206 00:32:01.896713 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5d4f95c8c8-c5lws_0266e747-392d-46c1-bc3e-0ef614db01e3/barbican-keystone-listener/0.log" Dec 06 00:32:01 crc kubenswrapper[4734]: I1206 00:32:01.935149 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5d4f95c8c8-c5lws_0266e747-392d-46c1-bc3e-0ef614db01e3/barbican-keystone-listener-log/0.log" Dec 06 00:32:01 crc kubenswrapper[4734]: I1206 00:32:01.944173 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-57c5555847-t5zf4_b35b4bd8-efbd-4f96-9962-490ea41d44d1/barbican-worker/0.log" Dec 06 00:32:02 crc kubenswrapper[4734]: I1206 00:32:02.145852 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-57c5555847-t5zf4_b35b4bd8-efbd-4f96-9962-490ea41d44d1/barbican-worker-log/0.log" Dec 06 00:32:02 crc kubenswrapper[4734]: I1206 00:32:02.200474 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-ldrpw_faef139d-614e-4c50-a383-8dd231a47b83/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 00:32:02 crc kubenswrapper[4734]: I1206 00:32:02.400113 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_90a03731-2e0d-4698-a55e-0af3ef5372be/proxy-httpd/0.log" Dec 06 00:32:02 crc kubenswrapper[4734]: I1206 00:32:02.412715 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_90a03731-2e0d-4698-a55e-0af3ef5372be/ceilometer-central-agent/0.log" Dec 06 00:32:02 crc kubenswrapper[4734]: I1206 00:32:02.468045 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_90a03731-2e0d-4698-a55e-0af3ef5372be/sg-core/0.log" Dec 06 00:32:02 crc kubenswrapper[4734]: I1206 00:32:02.483470 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_90a03731-2e0d-4698-a55e-0af3ef5372be/ceilometer-notification-agent/0.log" Dec 06 00:32:02 crc kubenswrapper[4734]: I1206 00:32:02.734480 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_6c79a17a-a1f1-481f-90de-cdcfe632a079/cinder-api-log/0.log" Dec 06 00:32:02 crc kubenswrapper[4734]: I1206 00:32:02.767878 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_6c79a17a-a1f1-481f-90de-cdcfe632a079/cinder-api/0.log" Dec 06 00:32:02 crc kubenswrapper[4734]: I1206 00:32:02.950764 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-db-purge-29416321-tqhwd_3b9dc1e0-8ec1-49c3-8e94-14dc3f07f124/cinder-db-purge/0.log" Dec 06 00:32:03 crc kubenswrapper[4734]: I1206 00:32:03.081623 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d36cae72-5806-4d9c-80a9-c396c5ca00d6/probe/0.log" Dec 06 00:32:03 crc kubenswrapper[4734]: I1206 00:32:03.099615 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d36cae72-5806-4d9c-80a9-c396c5ca00d6/cinder-scheduler/0.log" Dec 06 00:32:03 crc kubenswrapper[4734]: I1206 00:32:03.335243 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-xl5qq_f183bc38-e046-45f6-b96a-440e596c8088/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 00:32:03 crc kubenswrapper[4734]: I1206 00:32:03.367541 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-hwrl9_6de30094-9f75-467b-a935-3abbdf98e94c/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 00:32:03 crc 
kubenswrapper[4734]: I1206 00:32:03.520834 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-p7nf9_83b046ba-a4ad-4e9b-b266-a23db4ef72ae/init/0.log" Dec 06 00:32:03 crc kubenswrapper[4734]: I1206 00:32:03.737403 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-p7nf9_83b046ba-a4ad-4e9b-b266-a23db4ef72ae/init/0.log" Dec 06 00:32:03 crc kubenswrapper[4734]: I1206 00:32:03.815069 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-b8d9k_b881d911-43a8-4290-98e8-89e268e162e4/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 00:32:03 crc kubenswrapper[4734]: I1206 00:32:03.860959 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-p7nf9_83b046ba-a4ad-4e9b-b266-a23db4ef72ae/dnsmasq-dns/0.log" Dec 06 00:32:04 crc kubenswrapper[4734]: I1206 00:32:04.076076 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-db-purge-29416321-zr2lx_e30aee89-812f-4e60-997e-54de845b7afe/glance-dbpurge/0.log" Dec 06 00:32:04 crc kubenswrapper[4734]: I1206 00:32:04.175682 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_eddf1584-198a-4279-a09a-30500f1842f3/glance-httpd/0.log" Dec 06 00:32:04 crc kubenswrapper[4734]: I1206 00:32:04.232906 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_eddf1584-198a-4279-a09a-30500f1842f3/glance-log/0.log" Dec 06 00:32:04 crc kubenswrapper[4734]: I1206 00:32:04.381249 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d6b10458-86e7-4568-b3b5-2a3e090b90a8/glance-httpd/0.log" Dec 06 00:32:04 crc kubenswrapper[4734]: I1206 00:32:04.433069 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_d6b10458-86e7-4568-b3b5-2a3e090b90a8/glance-log/0.log" Dec 06 00:32:04 crc kubenswrapper[4734]: I1206 00:32:04.562458 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-755fc898d8-dlnbz_bbcbbde9-55c9-48dc-866d-ab670775e9b3/horizon/0.log" Dec 06 00:32:04 crc kubenswrapper[4734]: I1206 00:32:04.740438 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-mjjrv_131ed9d5-6ee3-41f4-9e7f-400cd4c0fe98/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 00:32:04 crc kubenswrapper[4734]: I1206 00:32:04.939815 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-lqrnf_43caeb9a-1d22-41be-abb1-48b4881e6afb/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 00:32:04 crc kubenswrapper[4734]: I1206 00:32:04.995315 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29416321-nx5mh_0e2a8f39-3819-46e4-9f5c-b2378637486f/keystone-cron/0.log" Dec 06 00:32:05 crc kubenswrapper[4734]: I1206 00:32:05.014783 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-755fc898d8-dlnbz_bbcbbde9-55c9-48dc-866d-ab670775e9b3/horizon-log/0.log" Dec 06 00:32:05 crc kubenswrapper[4734]: I1206 00:32:05.251309 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_e4c76f3a-43a9-43fc-be28-d7d3081d5e39/kube-state-metrics/0.log" Dec 06 00:32:05 crc kubenswrapper[4734]: I1206 00:32:05.321720 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-fffd48d8f-srcmr_26447265-57c1-45c6-bbef-cf7b2a82ed85/keystone-api/0.log" Dec 06 00:32:05 crc kubenswrapper[4734]: I1206 00:32:05.471348 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-fnfsk_85f32997-f801-4f60-b010-aaff637a8292/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 00:32:05 crc kubenswrapper[4734]: I1206 00:32:05.720831 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-67d74d57d5-4s4p7_f4201381-aab2-40da-9f4a-dc31e8874266/neutron-api/0.log" Dec 06 00:32:05 crc kubenswrapper[4734]: I1206 00:32:05.850256 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-67d74d57d5-4s4p7_f4201381-aab2-40da-9f4a-dc31e8874266/neutron-httpd/0.log" Dec 06 00:32:05 crc kubenswrapper[4734]: I1206 00:32:05.954889 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs7l2_e4c89d06-2d3b-47f8-bc2e-fa34a9d89453/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 00:32:06 crc kubenswrapper[4734]: I1206 00:32:06.581587 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_fe37850d-71e6-4310-9c74-b98b792cecc4/nova-api-log/0.log" Dec 06 00:32:06 crc kubenswrapper[4734]: I1206 00:32:06.651543 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_97634e74-2d01-49ae-b584-650725749027/nova-cell0-conductor-conductor/0.log" Dec 06 00:32:06 crc kubenswrapper[4734]: I1206 00:32:06.749240 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-db-purge-29416320-kjz69_e0efe4eb-7e90-4ea7-8c6a-d3c95c3845a5/nova-manage/0.log" Dec 06 00:32:07 crc kubenswrapper[4734]: I1206 00:32:07.131897 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-db-purge-29416320-x2nsf_1d498a8e-4ace-4a26-9c32-2dbc411c0b50/nova-manage/0.log" Dec 06 00:32:07 crc kubenswrapper[4734]: I1206 00:32:07.143666 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-conductor-0_b801f420-78a0-4564-9339-fca1170a01d7/nova-cell1-conductor-conductor/0.log" Dec 06 00:32:07 crc kubenswrapper[4734]: I1206 00:32:07.175352 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_fe37850d-71e6-4310-9c74-b98b792cecc4/nova-api-api/0.log" Dec 06 00:32:07 crc kubenswrapper[4734]: I1206 00:32:07.524541 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-l7kts_7d966291-cd7e-47ce-a95e-bee879371108/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 00:32:07 crc kubenswrapper[4734]: I1206 00:32:07.591123 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_41bee178-e2d7-4047-9c0a-429dc21411ed/nova-cell1-novncproxy-novncproxy/0.log" Dec 06 00:32:07 crc kubenswrapper[4734]: I1206 00:32:07.769366 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_6929b8c5-4cb9-49cd-a084-d578657ce0bf/nova-metadata-log/0.log" Dec 06 00:32:08 crc kubenswrapper[4734]: I1206 00:32:08.333638 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_3cc9e4dc-431f-4963-911b-f6262ac3c6b5/mysql-bootstrap/0.log" Dec 06 00:32:08 crc kubenswrapper[4734]: I1206 00:32:08.421764 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_77364fbf-3dbe-45c3-adf1-94410f61f0ce/nova-scheduler-scheduler/0.log" Dec 06 00:32:08 crc kubenswrapper[4734]: I1206 00:32:08.593625 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_3cc9e4dc-431f-4963-911b-f6262ac3c6b5/mysql-bootstrap/0.log" Dec 06 00:32:08 crc kubenswrapper[4734]: I1206 00:32:08.618593 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_3cc9e4dc-431f-4963-911b-f6262ac3c6b5/galera/0.log" Dec 06 00:32:08 crc kubenswrapper[4734]: I1206 00:32:08.793168 
4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_9fd725c7-f12a-4504-a71d-46e7d0258af7/mysql-bootstrap/0.log" Dec 06 00:32:09 crc kubenswrapper[4734]: I1206 00:32:09.060283 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_9fd725c7-f12a-4504-a71d-46e7d0258af7/galera/0.log" Dec 06 00:32:09 crc kubenswrapper[4734]: I1206 00:32:09.069105 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_9fd725c7-f12a-4504-a71d-46e7d0258af7/mysql-bootstrap/0.log" Dec 06 00:32:09 crc kubenswrapper[4734]: I1206 00:32:09.312276 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-587wk_625f2253-5867-4d61-a436-264a79c0bd94/ovn-controller/0.log" Dec 06 00:32:09 crc kubenswrapper[4734]: I1206 00:32:09.332281 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_1538ece1-e24d-4f20-b92d-0b526d1f5698/openstackclient/0.log" Dec 06 00:32:09 crc kubenswrapper[4734]: I1206 00:32:09.524562 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-cpzs4_b246fed6-9a79-4d72-a73a-943b13d8e30b/openstack-network-exporter/0.log" Dec 06 00:32:09 crc kubenswrapper[4734]: I1206 00:32:09.783189 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tpdrq_9631bcf5-05df-4e1d-b849-7352ef35013f/ovsdb-server-init/0.log" Dec 06 00:32:09 crc kubenswrapper[4734]: I1206 00:32:09.857495 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_6929b8c5-4cb9-49cd-a084-d578657ce0bf/nova-metadata-metadata/0.log" Dec 06 00:32:09 crc kubenswrapper[4734]: I1206 00:32:09.974197 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tpdrq_9631bcf5-05df-4e1d-b849-7352ef35013f/ovsdb-server-init/0.log" Dec 06 00:32:10 crc kubenswrapper[4734]: I1206 00:32:10.085599 4734 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tpdrq_9631bcf5-05df-4e1d-b849-7352ef35013f/ovsdb-server/0.log" Dec 06 00:32:10 crc kubenswrapper[4734]: I1206 00:32:10.088291 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tpdrq_9631bcf5-05df-4e1d-b849-7352ef35013f/ovs-vswitchd/0.log" Dec 06 00:32:10 crc kubenswrapper[4734]: I1206 00:32:10.221871 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-dxg77_4b772014-ade2-4ef1-9795-8a6eb255f57f/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 00:32:10 crc kubenswrapper[4734]: I1206 00:32:10.339101 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_bf6b4283-12e2-489b-9808-9b4f21a2c080/openstack-network-exporter/0.log" Dec 06 00:32:10 crc kubenswrapper[4734]: I1206 00:32:10.476835 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_bf6b4283-12e2-489b-9808-9b4f21a2c080/ovn-northd/0.log" Dec 06 00:32:10 crc kubenswrapper[4734]: I1206 00:32:10.600583 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3fceddd4-e096-4a7e-875f-756279962334/ovsdbserver-nb/0.log" Dec 06 00:32:10 crc kubenswrapper[4734]: I1206 00:32:10.651739 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3fceddd4-e096-4a7e-875f-756279962334/openstack-network-exporter/0.log" Dec 06 00:32:10 crc kubenswrapper[4734]: I1206 00:32:10.796712 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350/openstack-network-exporter/0.log" Dec 06 00:32:11 crc kubenswrapper[4734]: I1206 00:32:11.536203 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2f3dcdbf-2c38-4e2a-9420-c2d7f9b75350/ovsdbserver-sb/0.log" Dec 06 00:32:11 crc kubenswrapper[4734]: I1206 00:32:11.615273 4734 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_placement-85fbcb99c8-4gdvt_ccbbcfb6-1ffd-4c8e-8945-9d496467e46a/placement-api/0.log" Dec 06 00:32:11 crc kubenswrapper[4734]: I1206 00:32:11.724710 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-85fbcb99c8-4gdvt_ccbbcfb6-1ffd-4c8e-8945-9d496467e46a/placement-log/0.log" Dec 06 00:32:11 crc kubenswrapper[4734]: I1206 00:32:11.853133 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_34a9d7ac-2a42-4352-8eb3-23d34cfc5696/setup-container/0.log" Dec 06 00:32:12 crc kubenswrapper[4734]: I1206 00:32:12.108112 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_34a9d7ac-2a42-4352-8eb3-23d34cfc5696/setup-container/0.log" Dec 06 00:32:12 crc kubenswrapper[4734]: I1206 00:32:12.126707 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_556dbce3-075c-473a-ab0d-ea67ffc3e144/setup-container/0.log" Dec 06 00:32:12 crc kubenswrapper[4734]: I1206 00:32:12.151633 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_34a9d7ac-2a42-4352-8eb3-23d34cfc5696/rabbitmq/0.log" Dec 06 00:32:12 crc kubenswrapper[4734]: I1206 00:32:12.403853 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_556dbce3-075c-473a-ab0d-ea67ffc3e144/setup-container/0.log" Dec 06 00:32:12 crc kubenswrapper[4734]: I1206 00:32:12.499289 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_556dbce3-075c-473a-ab0d-ea67ffc3e144/rabbitmq/0.log" Dec 06 00:32:12 crc kubenswrapper[4734]: I1206 00:32:12.527008 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-lz74w_b9d39a80-01a8-421a-afac-94171314c0e1/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 00:32:12 crc kubenswrapper[4734]: I1206 
00:32:12.721115 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-z9lz9_29e8f09f-ca59-420f-ae3c-8bdb696d653a/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 00:32:12 crc kubenswrapper[4734]: I1206 00:32:12.810767 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-8lq4p_12cd9906-9f9f-42ba-8869-54f39ae29366/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 00:32:12 crc kubenswrapper[4734]: I1206 00:32:12.973481 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-pmm9q_378f4ff2-7e86-40ca-b771-155a02f5cb45/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 00:32:13 crc kubenswrapper[4734]: I1206 00:32:13.089030 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-mtqlw_d16abc61-9f6e-4980-9821-af436f2501fe/ssh-known-hosts-edpm-deployment/0.log" Dec 06 00:32:13 crc kubenswrapper[4734]: I1206 00:32:13.830246 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-qdl57_a1e03821-b44b-4ce9-8fb9-6831bf8b087f/swift-ring-rebalance/0.log" Dec 06 00:32:13 crc kubenswrapper[4734]: I1206 00:32:13.846027 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-66674dc5bc-l642k_d955842c-e3a2-4a05-a380-78c6f2fbdf3b/proxy-server/0.log" Dec 06 00:32:14 crc kubenswrapper[4734]: I1206 00:32:14.025642 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-66674dc5bc-l642k_d955842c-e3a2-4a05-a380-78c6f2fbdf3b/proxy-httpd/0.log" Dec 06 00:32:14 crc kubenswrapper[4734]: I1206 00:32:14.104168 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fea25d07-8cbc-4875-89e8-1752b0ee2a9e/account-auditor/0.log" Dec 06 00:32:14 crc kubenswrapper[4734]: I1206 00:32:14.115742 4734 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fea25d07-8cbc-4875-89e8-1752b0ee2a9e/account-reaper/0.log" Dec 06 00:32:14 crc kubenswrapper[4734]: I1206 00:32:14.307227 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fea25d07-8cbc-4875-89e8-1752b0ee2a9e/account-replicator/0.log" Dec 06 00:32:14 crc kubenswrapper[4734]: I1206 00:32:14.354699 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fea25d07-8cbc-4875-89e8-1752b0ee2a9e/account-server/0.log" Dec 06 00:32:14 crc kubenswrapper[4734]: I1206 00:32:14.385584 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fea25d07-8cbc-4875-89e8-1752b0ee2a9e/container-auditor/0.log" Dec 06 00:32:14 crc kubenswrapper[4734]: I1206 00:32:14.506190 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fea25d07-8cbc-4875-89e8-1752b0ee2a9e/container-server/0.log" Dec 06 00:32:14 crc kubenswrapper[4734]: I1206 00:32:14.586471 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fea25d07-8cbc-4875-89e8-1752b0ee2a9e/container-replicator/0.log" Dec 06 00:32:14 crc kubenswrapper[4734]: I1206 00:32:14.627965 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fea25d07-8cbc-4875-89e8-1752b0ee2a9e/container-updater/0.log" Dec 06 00:32:14 crc kubenswrapper[4734]: I1206 00:32:14.723660 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fea25d07-8cbc-4875-89e8-1752b0ee2a9e/object-auditor/0.log" Dec 06 00:32:14 crc kubenswrapper[4734]: I1206 00:32:14.757769 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fea25d07-8cbc-4875-89e8-1752b0ee2a9e/object-expirer/0.log" Dec 06 00:32:14 crc kubenswrapper[4734]: I1206 00:32:14.861003 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_fea25d07-8cbc-4875-89e8-1752b0ee2a9e/object-replicator/0.log" Dec 06 00:32:14 crc kubenswrapper[4734]: I1206 00:32:14.914482 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fea25d07-8cbc-4875-89e8-1752b0ee2a9e/object-server/0.log" Dec 06 00:32:14 crc kubenswrapper[4734]: I1206 00:32:14.988048 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fea25d07-8cbc-4875-89e8-1752b0ee2a9e/rsync/0.log" Dec 06 00:32:15 crc kubenswrapper[4734]: I1206 00:32:15.004075 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fea25d07-8cbc-4875-89e8-1752b0ee2a9e/object-updater/0.log" Dec 06 00:32:15 crc kubenswrapper[4734]: I1206 00:32:15.141895 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fea25d07-8cbc-4875-89e8-1752b0ee2a9e/swift-recon-cron/0.log" Dec 06 00:32:15 crc kubenswrapper[4734]: I1206 00:32:15.268913 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-6m5nb_039811b0-a938-445d-b5a4-702b526f8356/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 00:32:15 crc kubenswrapper[4734]: I1206 00:32:15.446107 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_5d24dfd1-9ec6-4419-84c6-577deb60b95f/tempest-tests-tempest-tests-runner/0.log" Dec 06 00:32:15 crc kubenswrapper[4734]: I1206 00:32:15.554219 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_e615b54d-cff3-4de2-8569-9c492e2234e0/test-operator-logs-container/0.log" Dec 06 00:32:15 crc kubenswrapper[4734]: I1206 00:32:15.714019 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-45qzx_cc15ec12-e046-4933-beec-886e0868c644/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 06 00:32:20 crc kubenswrapper[4734]: I1206 00:32:20.444513 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 00:32:20 crc kubenswrapper[4734]: I1206 00:32:20.445322 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 00:32:25 crc kubenswrapper[4734]: I1206 00:32:25.859428 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_61801e1d-6a79-497f-822b-69b683c2f78b/memcached/0.log" Dec 06 00:32:44 crc kubenswrapper[4734]: I1206 00:32:44.809676 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x_d6a5b5d0-ee84-4715-8024-25698133af6b/util/0.log" Dec 06 00:32:44 crc kubenswrapper[4734]: I1206 00:32:44.992145 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x_d6a5b5d0-ee84-4715-8024-25698133af6b/util/0.log" Dec 06 00:32:45 crc kubenswrapper[4734]: I1206 00:32:45.062473 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x_d6a5b5d0-ee84-4715-8024-25698133af6b/pull/0.log" Dec 06 00:32:45 crc kubenswrapper[4734]: I1206 00:32:45.073979 4734 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x_d6a5b5d0-ee84-4715-8024-25698133af6b/pull/0.log" Dec 06 00:32:45 crc kubenswrapper[4734]: I1206 00:32:45.243333 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x_d6a5b5d0-ee84-4715-8024-25698133af6b/util/0.log" Dec 06 00:32:45 crc kubenswrapper[4734]: I1206 00:32:45.255353 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x_d6a5b5d0-ee84-4715-8024-25698133af6b/pull/0.log" Dec 06 00:32:45 crc kubenswrapper[4734]: I1206 00:32:45.277777 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_15268b67d17595ca1643fa87cbefd21f8c840d78fb4a95ff1703f57695rzq9x_d6a5b5d0-ee84-4715-8024-25698133af6b/extract/0.log" Dec 06 00:32:45 crc kubenswrapper[4734]: I1206 00:32:45.472065 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-bl5vb_4ac00d0e-d1c1-44d8-869d-1d98f5a137e0/kube-rbac-proxy/0.log" Dec 06 00:32:45 crc kubenswrapper[4734]: I1206 00:32:45.534657 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-bl5vb_4ac00d0e-d1c1-44d8-869d-1d98f5a137e0/manager/0.log" Dec 06 00:32:45 crc kubenswrapper[4734]: I1206 00:32:45.582462 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-4clqb_6ba0bb79-4132-4bd9-a2ce-c8a9b516402d/kube-rbac-proxy/0.log" Dec 06 00:32:45 crc kubenswrapper[4734]: I1206 00:32:45.770425 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-4clqb_6ba0bb79-4132-4bd9-a2ce-c8a9b516402d/manager/0.log" Dec 06 00:32:45 
crc kubenswrapper[4734]: I1206 00:32:45.809987 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-fwbcd_157817be-876f-4157-87af-6ef317b91cb9/kube-rbac-proxy/0.log" Dec 06 00:32:45 crc kubenswrapper[4734]: I1206 00:32:45.861119 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-fwbcd_157817be-876f-4157-87af-6ef317b91cb9/manager/0.log" Dec 06 00:32:46 crc kubenswrapper[4734]: I1206 00:32:46.072882 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-rp9j7_c12a23f4-fdd7-455e-b74c-f757f15990ca/kube-rbac-proxy/0.log" Dec 06 00:32:46 crc kubenswrapper[4734]: I1206 00:32:46.151354 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-rp9j7_c12a23f4-fdd7-455e-b74c-f757f15990ca/manager/0.log" Dec 06 00:32:46 crc kubenswrapper[4734]: I1206 00:32:46.243610 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-c7c94_4685a9c2-ef1c-462d-848c-fbbea6a8ebfe/kube-rbac-proxy/0.log" Dec 06 00:32:46 crc kubenswrapper[4734]: I1206 00:32:46.321708 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-c7c94_4685a9c2-ef1c-462d-848c-fbbea6a8ebfe/manager/0.log" Dec 06 00:32:46 crc kubenswrapper[4734]: I1206 00:32:46.460154 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-mnmlv_9a792918-0311-4b1b-8920-a315370ecba7/kube-rbac-proxy/0.log" Dec 06 00:32:46 crc kubenswrapper[4734]: I1206 00:32:46.480206 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-mnmlv_9a792918-0311-4b1b-8920-a315370ecba7/manager/0.log" Dec 06 00:32:46 crc kubenswrapper[4734]: I1206 00:32:46.676095 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-r2427_df5aaec7-4487-47a1-98c4-0206d0ecf7f4/kube-rbac-proxy/0.log" Dec 06 00:32:46 crc kubenswrapper[4734]: I1206 00:32:46.821339 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-rw8vg_ef794353-3292-4809-94d8-105aaa36889e/kube-rbac-proxy/0.log" Dec 06 00:32:46 crc kubenswrapper[4734]: I1206 00:32:46.910118 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-r2427_df5aaec7-4487-47a1-98c4-0206d0ecf7f4/manager/0.log" Dec 06 00:32:46 crc kubenswrapper[4734]: I1206 00:32:46.963183 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-rw8vg_ef794353-3292-4809-94d8-105aaa36889e/manager/0.log" Dec 06 00:32:47 crc kubenswrapper[4734]: I1206 00:32:47.094036 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-t5gsd_3255ef71-c5a8-4fef-a1ab-dc2107c710eb/kube-rbac-proxy/0.log" Dec 06 00:32:47 crc kubenswrapper[4734]: I1206 00:32:47.187337 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-t5gsd_3255ef71-c5a8-4fef-a1ab-dc2107c710eb/manager/0.log" Dec 06 00:32:47 crc kubenswrapper[4734]: I1206 00:32:47.304084 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-lwpjm_9883e2bb-76f7-476d-8a74-e358ebf37ed2/kube-rbac-proxy/0.log" Dec 06 00:32:47 crc kubenswrapper[4734]: I1206 00:32:47.413161 
4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-lwpjm_9883e2bb-76f7-476d-8a74-e358ebf37ed2/manager/0.log" Dec 06 00:32:47 crc kubenswrapper[4734]: I1206 00:32:47.413403 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-nc6wd_608bca6a-1cb5-44b9-91c6-32a77372a4e5/kube-rbac-proxy/0.log" Dec 06 00:32:47 crc kubenswrapper[4734]: I1206 00:32:47.563758 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-nc6wd_608bca6a-1cb5-44b9-91c6-32a77372a4e5/manager/0.log" Dec 06 00:32:47 crc kubenswrapper[4734]: I1206 00:32:47.652023 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-zx66z_58aa2c14-9374-45b1-b6dd-07e849f23306/kube-rbac-proxy/0.log" Dec 06 00:32:47 crc kubenswrapper[4734]: I1206 00:32:47.812722 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-zx66z_58aa2c14-9374-45b1-b6dd-07e849f23306/manager/0.log" Dec 06 00:32:47 crc kubenswrapper[4734]: I1206 00:32:47.874836 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-wf4vr_aa5ccaa9-5087-4891-b255-a5135271a2a5/kube-rbac-proxy/0.log" Dec 06 00:32:47 crc kubenswrapper[4734]: I1206 00:32:47.990178 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-wf4vr_aa5ccaa9-5087-4891-b255-a5135271a2a5/manager/0.log" Dec 06 00:32:48 crc kubenswrapper[4734]: I1206 00:32:48.106720 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-bf28l_3ab5c543-f1e6-455c-a051-7940ffcc833d/kube-rbac-proxy/0.log" Dec 06 00:32:48 crc 
kubenswrapper[4734]: I1206 00:32:48.156753 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-bf28l_3ab5c543-f1e6-455c-a051-7940ffcc833d/manager/0.log" Dec 06 00:32:48 crc kubenswrapper[4734]: I1206 00:32:48.325286 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879fhnqmj_9cce8abe-4425-4cea-ac4f-3fd707bd5737/kube-rbac-proxy/0.log" Dec 06 00:32:48 crc kubenswrapper[4734]: I1206 00:32:48.365354 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879fhnqmj_9cce8abe-4425-4cea-ac4f-3fd707bd5737/manager/0.log" Dec 06 00:32:48 crc kubenswrapper[4734]: I1206 00:32:48.871584 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-c7wgh_99bde61f-d552-4013-b4fc-eb55e428f53b/registry-server/0.log" Dec 06 00:32:48 crc kubenswrapper[4734]: I1206 00:32:48.908664 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-58b957df85-wbwkx_9ef6c4e9-8341-489c-9f21-ffda1c3ef34a/operator/0.log" Dec 06 00:32:49 crc kubenswrapper[4734]: I1206 00:32:49.003896 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-hdt6h_ea29b614-e490-4a3e-925e-d9f6c56b0c35/kube-rbac-proxy/0.log" Dec 06 00:32:49 crc kubenswrapper[4734]: I1206 00:32:49.165847 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-hdt6h_ea29b614-e490-4a3e-925e-d9f6c56b0c35/manager/0.log" Dec 06 00:32:49 crc kubenswrapper[4734]: I1206 00:32:49.260815 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-w8l4b_ad6bda6e-964f-44c3-b759-ad151097b4f1/kube-rbac-proxy/0.log" Dec 06 00:32:49 crc kubenswrapper[4734]: I1206 00:32:49.320158 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-w8l4b_ad6bda6e-964f-44c3-b759-ad151097b4f1/manager/0.log" Dec 06 00:32:49 crc kubenswrapper[4734]: I1206 00:32:49.459331 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-4bt9z_43c8ec4c-96f9-47f0-9313-2813ea1c62c2/operator/0.log" Dec 06 00:32:49 crc kubenswrapper[4734]: I1206 00:32:49.560718 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-cdtjx_2050fd66-c55a-4048-a869-cb786b5f0d2b/kube-rbac-proxy/0.log" Dec 06 00:32:49 crc kubenswrapper[4734]: I1206 00:32:49.747013 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-cdtjx_2050fd66-c55a-4048-a869-cb786b5f0d2b/manager/0.log" Dec 06 00:32:49 crc kubenswrapper[4734]: I1206 00:32:49.933310 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5845f76896-vhzwq_2bbe7e3e-6a43-4bf8-b8f4-3d88416cd32e/manager/0.log" Dec 06 00:32:50 crc kubenswrapper[4734]: I1206 00:32:50.013794 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-2fm4z_696f07ba-7c46-41f2-826f-890756824285/kube-rbac-proxy/0.log" Dec 06 00:32:50 crc kubenswrapper[4734]: I1206 00:32:50.052966 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-2fm4z_696f07ba-7c46-41f2-826f-890756824285/manager/0.log" Dec 06 00:32:50 crc kubenswrapper[4734]: I1206 00:32:50.116989 
4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-j4m2j_974bff7e-6bfc-49c2-9d3d-831d1bf5385d/kube-rbac-proxy/0.log" Dec 06 00:32:50 crc kubenswrapper[4734]: I1206 00:32:50.170075 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-j4m2j_974bff7e-6bfc-49c2-9d3d-831d1bf5385d/manager/0.log" Dec 06 00:32:50 crc kubenswrapper[4734]: I1206 00:32:50.263425 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-667bd8d554-xwqfj_b7ee6df9-99e2-480d-aa84-7618ff0cda2f/kube-rbac-proxy/0.log" Dec 06 00:32:50 crc kubenswrapper[4734]: I1206 00:32:50.376716 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-667bd8d554-xwqfj_b7ee6df9-99e2-480d-aa84-7618ff0cda2f/manager/0.log" Dec 06 00:32:50 crc kubenswrapper[4734]: I1206 00:32:50.444799 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 00:32:50 crc kubenswrapper[4734]: I1206 00:32:50.444898 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 00:32:50 crc kubenswrapper[4734]: I1206 00:32:50.444953 4734 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" Dec 06 00:32:50 crc kubenswrapper[4734]: I1206 00:32:50.446050 4734 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"defa7a915f6f81a237f6907715eb5c4650a4cd13fa52e345ad569553ddc1f24d"} pod="openshift-machine-config-operator/machine-config-daemon-vn94d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 06 00:32:50 crc kubenswrapper[4734]: I1206 00:32:50.446120 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" containerID="cri-o://defa7a915f6f81a237f6907715eb5c4650a4cd13fa52e345ad569553ddc1f24d" gracePeriod=600 Dec 06 00:32:50 crc kubenswrapper[4734]: E1206 00:32:50.572402 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:32:50 crc kubenswrapper[4734]: I1206 00:32:50.672741 4734 generic.go:334] "Generic (PLEG): container finished" podID="65758270-a7a7-46b5-af95-0588daf9fa86" containerID="defa7a915f6f81a237f6907715eb5c4650a4cd13fa52e345ad569553ddc1f24d" exitCode=0 Dec 06 00:32:50 crc kubenswrapper[4734]: I1206 00:32:50.672823 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerDied","Data":"defa7a915f6f81a237f6907715eb5c4650a4cd13fa52e345ad569553ddc1f24d"} Dec 06 00:32:50 crc kubenswrapper[4734]: I1206 00:32:50.672899 4734 scope.go:117] "RemoveContainer" 
containerID="81278da5539b6ec2789a9334b6c891b6b36cb63a7e5b4a031f5c2f40b60a134e" Dec 06 00:32:50 crc kubenswrapper[4734]: I1206 00:32:50.673905 4734 scope.go:117] "RemoveContainer" containerID="defa7a915f6f81a237f6907715eb5c4650a4cd13fa52e345ad569553ddc1f24d" Dec 06 00:32:50 crc kubenswrapper[4734]: E1206 00:32:50.674237 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:33:01 crc kubenswrapper[4734]: I1206 00:33:01.615872 4734 scope.go:117] "RemoveContainer" containerID="defa7a915f6f81a237f6907715eb5c4650a4cd13fa52e345ad569553ddc1f24d" Dec 06 00:33:01 crc kubenswrapper[4734]: E1206 00:33:01.616995 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:33:14 crc kubenswrapper[4734]: I1206 00:33:14.410457 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-qvhvd_7b7cfcc6-ee0e-4af5-9e03-5d8cbb40edbb/control-plane-machine-set-operator/0.log" Dec 06 00:33:14 crc kubenswrapper[4734]: I1206 00:33:14.623712 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-j6jsf_74a8397f-0607-4761-9fc5-77e9a6d197c8/machine-api-operator/0.log" Dec 06 00:33:14 crc kubenswrapper[4734]: 
I1206 00:33:14.624056 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-j6jsf_74a8397f-0607-4761-9fc5-77e9a6d197c8/kube-rbac-proxy/0.log" Dec 06 00:33:15 crc kubenswrapper[4734]: I1206 00:33:15.614722 4734 scope.go:117] "RemoveContainer" containerID="defa7a915f6f81a237f6907715eb5c4650a4cd13fa52e345ad569553ddc1f24d" Dec 06 00:33:15 crc kubenswrapper[4734]: E1206 00:33:15.615515 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:33:28 crc kubenswrapper[4734]: I1206 00:33:28.615518 4734 scope.go:117] "RemoveContainer" containerID="defa7a915f6f81a237f6907715eb5c4650a4cd13fa52e345ad569553ddc1f24d" Dec 06 00:33:28 crc kubenswrapper[4734]: E1206 00:33:28.616640 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:33:29 crc kubenswrapper[4734]: I1206 00:33:29.541458 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-hnchl_77b0debe-a9d9-495d-baf3-e5ad3c05541a/cert-manager-controller/0.log" Dec 06 00:33:29 crc kubenswrapper[4734]: I1206 00:33:29.692472 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-j92m9_4806cf35-7fd8-4044-8618-8e573c476375/cert-manager-cainjector/0.log" Dec 06 00:33:29 crc kubenswrapper[4734]: I1206 00:33:29.786288 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-pffbx_4aa5e323-62f7-491b-a47e-747b2d32cfc5/cert-manager-webhook/0.log" Dec 06 00:33:42 crc kubenswrapper[4734]: I1206 00:33:42.616507 4734 scope.go:117] "RemoveContainer" containerID="defa7a915f6f81a237f6907715eb5c4650a4cd13fa52e345ad569553ddc1f24d" Dec 06 00:33:42 crc kubenswrapper[4734]: E1206 00:33:42.618194 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:33:45 crc kubenswrapper[4734]: I1206 00:33:45.198644 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-cr2t4_238e4a30-5ad1-4948-b27f-41e096f3095a/nmstate-console-plugin/0.log" Dec 06 00:33:45 crc kubenswrapper[4734]: I1206 00:33:45.457086 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-x728q_ab1d36f1-0fc8-4ad6-8725-799c1838b033/nmstate-handler/0.log" Dec 06 00:33:45 crc kubenswrapper[4734]: I1206 00:33:45.539509 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-6p9xx_408e85a8-5bd9-4c30-bd55-5262e3a2aa24/kube-rbac-proxy/0.log" Dec 06 00:33:45 crc kubenswrapper[4734]: I1206 00:33:45.610966 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-6p9xx_408e85a8-5bd9-4c30-bd55-5262e3a2aa24/nmstate-metrics/0.log" Dec 06 00:33:46 crc kubenswrapper[4734]: I1206 00:33:46.067422 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-zxqpc_6bf99a15-c582-4a10-a26f-252c1c870f55/nmstate-webhook/0.log" Dec 06 00:33:46 crc kubenswrapper[4734]: I1206 00:33:46.076468 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-gxn6v_7b52ad0f-5f7e-4691-be39-ac2f121bb909/nmstate-operator/0.log" Dec 06 00:33:55 crc kubenswrapper[4734]: I1206 00:33:55.614922 4734 scope.go:117] "RemoveContainer" containerID="defa7a915f6f81a237f6907715eb5c4650a4cd13fa52e345ad569553ddc1f24d" Dec 06 00:33:55 crc kubenswrapper[4734]: E1206 00:33:55.615967 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:34:03 crc kubenswrapper[4734]: I1206 00:34:03.718109 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-cpg9k_ac2c5d10-25e3-4d0e-9632-ee5701c15e7e/kube-rbac-proxy/0.log" Dec 06 00:34:03 crc kubenswrapper[4734]: I1206 00:34:03.817122 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-cpg9k_ac2c5d10-25e3-4d0e-9632-ee5701c15e7e/controller/0.log" Dec 06 00:34:03 crc kubenswrapper[4734]: I1206 00:34:03.967101 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/cp-frr-files/0.log" Dec 06 00:34:04 crc kubenswrapper[4734]: I1206 
00:34:04.147631 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/cp-reloader/0.log" Dec 06 00:34:04 crc kubenswrapper[4734]: I1206 00:34:04.178516 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/cp-frr-files/0.log" Dec 06 00:34:04 crc kubenswrapper[4734]: I1206 00:34:04.228163 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/cp-metrics/0.log" Dec 06 00:34:04 crc kubenswrapper[4734]: I1206 00:34:04.253253 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/cp-reloader/0.log" Dec 06 00:34:04 crc kubenswrapper[4734]: I1206 00:34:04.475350 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/cp-frr-files/0.log" Dec 06 00:34:04 crc kubenswrapper[4734]: I1206 00:34:04.517005 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/cp-reloader/0.log" Dec 06 00:34:04 crc kubenswrapper[4734]: I1206 00:34:04.550334 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/cp-metrics/0.log" Dec 06 00:34:04 crc kubenswrapper[4734]: I1206 00:34:04.580979 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/cp-metrics/0.log" Dec 06 00:34:04 crc kubenswrapper[4734]: I1206 00:34:04.736617 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/cp-reloader/0.log" Dec 06 00:34:04 crc kubenswrapper[4734]: I1206 00:34:04.781858 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/cp-metrics/0.log" Dec 06 00:34:04 crc kubenswrapper[4734]: I1206 00:34:04.801092 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/controller/0.log" Dec 06 00:34:04 crc kubenswrapper[4734]: I1206 00:34:04.808426 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/cp-frr-files/0.log" Dec 06 00:34:05 crc kubenswrapper[4734]: I1206 00:34:05.043085 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/kube-rbac-proxy/0.log" Dec 06 00:34:05 crc kubenswrapper[4734]: I1206 00:34:05.065746 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/kube-rbac-proxy-frr/0.log" Dec 06 00:34:05 crc kubenswrapper[4734]: I1206 00:34:05.075075 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/frr-metrics/0.log" Dec 06 00:34:05 crc kubenswrapper[4734]: I1206 00:34:05.316952 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/reloader/0.log" Dec 06 00:34:05 crc kubenswrapper[4734]: I1206 00:34:05.319257 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-g87xk_80152489-1b48-4b06-8684-983081b45f88/frr-k8s-webhook-server/0.log" Dec 06 00:34:05 crc kubenswrapper[4734]: I1206 00:34:05.651742 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5b69785d4f-gksx6_df912953-69c4-4841-abb5-afa544bd8df7/manager/0.log" Dec 06 00:34:05 crc kubenswrapper[4734]: I1206 00:34:05.834030 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-8688474b6d-2dhr7_97e5de92-85a3-4262-a82f-5b7195d72a9c/webhook-server/0.log" Dec 06 00:34:05 crc kubenswrapper[4734]: I1206 00:34:05.899924 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-csdv2_0b07126f-ef86-48d5-b597-56782b518f5e/kube-rbac-proxy/0.log" Dec 06 00:34:06 crc kubenswrapper[4734]: I1206 00:34:06.592602 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-csdv2_0b07126f-ef86-48d5-b597-56782b518f5e/speaker/0.log" Dec 06 00:34:06 crc kubenswrapper[4734]: I1206 00:34:06.614156 4734 scope.go:117] "RemoveContainer" containerID="defa7a915f6f81a237f6907715eb5c4650a4cd13fa52e345ad569553ddc1f24d" Dec 06 00:34:06 crc kubenswrapper[4734]: E1206 00:34:06.614436 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:34:06 crc kubenswrapper[4734]: I1206 00:34:06.732005 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-klfrp_b6fc283a-61b9-4920-90d7-2636375a958b/frr/0.log" Dec 06 00:34:20 crc kubenswrapper[4734]: I1206 00:34:20.614646 4734 scope.go:117] "RemoveContainer" containerID="defa7a915f6f81a237f6907715eb5c4650a4cd13fa52e345ad569553ddc1f24d" Dec 06 00:34:20 crc kubenswrapper[4734]: E1206 00:34:20.615673 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:34:21 crc kubenswrapper[4734]: I1206 00:34:21.227346 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5_44479d90-68b0-4428-b667-5c5c8bbebf2e/util/0.log" Dec 06 00:34:21 crc kubenswrapper[4734]: I1206 00:34:21.437663 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5_44479d90-68b0-4428-b667-5c5c8bbebf2e/pull/0.log" Dec 06 00:34:21 crc kubenswrapper[4734]: I1206 00:34:21.495653 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5_44479d90-68b0-4428-b667-5c5c8bbebf2e/util/0.log" Dec 06 00:34:21 crc kubenswrapper[4734]: I1206 00:34:21.496052 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5_44479d90-68b0-4428-b667-5c5c8bbebf2e/pull/0.log" Dec 06 00:34:21 crc kubenswrapper[4734]: I1206 00:34:21.740275 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5_44479d90-68b0-4428-b667-5c5c8bbebf2e/util/0.log" Dec 06 00:34:21 crc kubenswrapper[4734]: I1206 00:34:21.759591 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5_44479d90-68b0-4428-b667-5c5c8bbebf2e/pull/0.log" Dec 06 00:34:21 crc kubenswrapper[4734]: I1206 00:34:21.770235 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fhf7j5_44479d90-68b0-4428-b667-5c5c8bbebf2e/extract/0.log" Dec 06 00:34:21 crc kubenswrapper[4734]: I1206 00:34:21.928458 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6_13194382-29bc-40a1-8f25-9566b13ad6ae/util/0.log" Dec 06 00:34:22 crc kubenswrapper[4734]: I1206 00:34:22.255443 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6_13194382-29bc-40a1-8f25-9566b13ad6ae/pull/0.log" Dec 06 00:34:22 crc kubenswrapper[4734]: I1206 00:34:22.265749 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6_13194382-29bc-40a1-8f25-9566b13ad6ae/util/0.log" Dec 06 00:34:22 crc kubenswrapper[4734]: I1206 00:34:22.267388 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6_13194382-29bc-40a1-8f25-9566b13ad6ae/pull/0.log" Dec 06 00:34:22 crc kubenswrapper[4734]: I1206 00:34:22.456266 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6_13194382-29bc-40a1-8f25-9566b13ad6ae/util/0.log" Dec 06 00:34:22 crc kubenswrapper[4734]: I1206 00:34:22.499992 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6_13194382-29bc-40a1-8f25-9566b13ad6ae/extract/0.log" Dec 06 00:34:22 crc kubenswrapper[4734]: I1206 00:34:22.527191 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rhtb6_13194382-29bc-40a1-8f25-9566b13ad6ae/pull/0.log" Dec 
06 00:34:22 crc kubenswrapper[4734]: I1206 00:34:22.702288 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zvcrv_03642c63-70ad-48c4-9fa1-c0e8d2d0d067/extract-utilities/0.log" Dec 06 00:34:22 crc kubenswrapper[4734]: I1206 00:34:22.947403 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zvcrv_03642c63-70ad-48c4-9fa1-c0e8d2d0d067/extract-utilities/0.log" Dec 06 00:34:22 crc kubenswrapper[4734]: I1206 00:34:22.948488 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zvcrv_03642c63-70ad-48c4-9fa1-c0e8d2d0d067/extract-content/0.log" Dec 06 00:34:23 crc kubenswrapper[4734]: I1206 00:34:23.013187 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zvcrv_03642c63-70ad-48c4-9fa1-c0e8d2d0d067/extract-content/0.log" Dec 06 00:34:23 crc kubenswrapper[4734]: I1206 00:34:23.207363 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zvcrv_03642c63-70ad-48c4-9fa1-c0e8d2d0d067/extract-utilities/0.log" Dec 06 00:34:23 crc kubenswrapper[4734]: I1206 00:34:23.256015 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zvcrv_03642c63-70ad-48c4-9fa1-c0e8d2d0d067/extract-content/0.log" Dec 06 00:34:23 crc kubenswrapper[4734]: I1206 00:34:23.490354 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rxk7p_1d3776cd-4682-4c8b-94e2-73bc8c1ee60e/extract-utilities/0.log" Dec 06 00:34:23 crc kubenswrapper[4734]: I1206 00:34:23.721256 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rxk7p_1d3776cd-4682-4c8b-94e2-73bc8c1ee60e/extract-content/0.log" Dec 06 00:34:23 crc kubenswrapper[4734]: I1206 00:34:23.731413 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-rxk7p_1d3776cd-4682-4c8b-94e2-73bc8c1ee60e/extract-utilities/0.log" Dec 06 00:34:23 crc kubenswrapper[4734]: I1206 00:34:23.777351 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rxk7p_1d3776cd-4682-4c8b-94e2-73bc8c1ee60e/extract-content/0.log" Dec 06 00:34:23 crc kubenswrapper[4734]: I1206 00:34:23.857904 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zvcrv_03642c63-70ad-48c4-9fa1-c0e8d2d0d067/registry-server/0.log" Dec 06 00:34:23 crc kubenswrapper[4734]: I1206 00:34:23.995415 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rxk7p_1d3776cd-4682-4c8b-94e2-73bc8c1ee60e/extract-content/0.log" Dec 06 00:34:24 crc kubenswrapper[4734]: I1206 00:34:24.162414 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rxk7p_1d3776cd-4682-4c8b-94e2-73bc8c1ee60e/extract-utilities/0.log" Dec 06 00:34:24 crc kubenswrapper[4734]: I1206 00:34:24.575915 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-zddtm_8cccd0a8-35c0-4e22-b73c-bc9282c804b6/marketplace-operator/0.log" Dec 06 00:34:24 crc kubenswrapper[4734]: I1206 00:34:24.741964 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rxk7p_1d3776cd-4682-4c8b-94e2-73bc8c1ee60e/registry-server/0.log" Dec 06 00:34:24 crc kubenswrapper[4734]: I1206 00:34:24.745783 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fxgbf_52d0416a-4b26-4c76-8296-f279ad8c4158/extract-utilities/0.log" Dec 06 00:34:25 crc kubenswrapper[4734]: I1206 00:34:25.530659 4734 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-fxgbf_52d0416a-4b26-4c76-8296-f279ad8c4158/extract-content/0.log" Dec 06 00:34:25 crc kubenswrapper[4734]: I1206 00:34:25.538142 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fxgbf_52d0416a-4b26-4c76-8296-f279ad8c4158/extract-utilities/0.log" Dec 06 00:34:25 crc kubenswrapper[4734]: I1206 00:34:25.599743 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fxgbf_52d0416a-4b26-4c76-8296-f279ad8c4158/extract-content/0.log" Dec 06 00:34:25 crc kubenswrapper[4734]: I1206 00:34:25.841177 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fxgbf_52d0416a-4b26-4c76-8296-f279ad8c4158/extract-utilities/0.log" Dec 06 00:34:25 crc kubenswrapper[4734]: I1206 00:34:25.849835 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fxgbf_52d0416a-4b26-4c76-8296-f279ad8c4158/extract-content/0.log" Dec 06 00:34:25 crc kubenswrapper[4734]: I1206 00:34:25.900124 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bspnj_9e82acab-ae84-48a7-83bd-7f83a96e3f7f/extract-utilities/0.log" Dec 06 00:34:26 crc kubenswrapper[4734]: I1206 00:34:26.042111 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fxgbf_52d0416a-4b26-4c76-8296-f279ad8c4158/registry-server/0.log" Dec 06 00:34:26 crc kubenswrapper[4734]: I1206 00:34:26.214773 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bspnj_9e82acab-ae84-48a7-83bd-7f83a96e3f7f/extract-content/0.log" Dec 06 00:34:26 crc kubenswrapper[4734]: I1206 00:34:26.214801 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bspnj_9e82acab-ae84-48a7-83bd-7f83a96e3f7f/extract-content/0.log" 
Dec 06 00:34:26 crc kubenswrapper[4734]: I1206 00:34:26.247984 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bspnj_9e82acab-ae84-48a7-83bd-7f83a96e3f7f/extract-utilities/0.log" Dec 06 00:34:26 crc kubenswrapper[4734]: I1206 00:34:26.427893 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bspnj_9e82acab-ae84-48a7-83bd-7f83a96e3f7f/extract-utilities/0.log" Dec 06 00:34:26 crc kubenswrapper[4734]: I1206 00:34:26.479481 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bspnj_9e82acab-ae84-48a7-83bd-7f83a96e3f7f/extract-content/0.log" Dec 06 00:34:27 crc kubenswrapper[4734]: I1206 00:34:27.122560 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bspnj_9e82acab-ae84-48a7-83bd-7f83a96e3f7f/registry-server/0.log" Dec 06 00:34:34 crc kubenswrapper[4734]: I1206 00:34:34.614640 4734 scope.go:117] "RemoveContainer" containerID="defa7a915f6f81a237f6907715eb5c4650a4cd13fa52e345ad569553ddc1f24d" Dec 06 00:34:34 crc kubenswrapper[4734]: E1206 00:34:34.615750 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:34:46 crc kubenswrapper[4734]: I1206 00:34:46.621844 4734 scope.go:117] "RemoveContainer" containerID="defa7a915f6f81a237f6907715eb5c4650a4cd13fa52e345ad569553ddc1f24d" Dec 06 00:34:46 crc kubenswrapper[4734]: E1206 00:34:46.622780 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:34:57 crc kubenswrapper[4734]: I1206 00:34:57.615416 4734 scope.go:117] "RemoveContainer" containerID="defa7a915f6f81a237f6907715eb5c4650a4cd13fa52e345ad569553ddc1f24d" Dec 06 00:34:57 crc kubenswrapper[4734]: E1206 00:34:57.616679 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:35:01 crc kubenswrapper[4734]: E1206 00:35:01.715340 4734 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.38:43018->38.102.83.38:44725: write tcp 38.102.83.38:43018->38.102.83.38:44725: write: broken pipe Dec 06 00:35:06 crc kubenswrapper[4734]: E1206 00:35:06.347779 4734 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.38:45068->38.102.83.38:44725: read tcp 38.102.83.38:45068->38.102.83.38:44725: read: connection reset by peer Dec 06 00:35:10 crc kubenswrapper[4734]: I1206 00:35:10.614648 4734 scope.go:117] "RemoveContainer" containerID="defa7a915f6f81a237f6907715eb5c4650a4cd13fa52e345ad569553ddc1f24d" Dec 06 00:35:10 crc kubenswrapper[4734]: E1206 00:35:10.615620 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:35:22 crc kubenswrapper[4734]: I1206 00:35:22.615542 4734 scope.go:117] "RemoveContainer" containerID="defa7a915f6f81a237f6907715eb5c4650a4cd13fa52e345ad569553ddc1f24d" Dec 06 00:35:22 crc kubenswrapper[4734]: E1206 00:35:22.617320 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:35:33 crc kubenswrapper[4734]: I1206 00:35:33.615852 4734 scope.go:117] "RemoveContainer" containerID="defa7a915f6f81a237f6907715eb5c4650a4cd13fa52e345ad569553ddc1f24d" Dec 06 00:35:33 crc kubenswrapper[4734]: E1206 00:35:33.617061 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:35:35 crc kubenswrapper[4734]: I1206 00:35:35.887832 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p7hx7"] Dec 06 00:35:35 crc kubenswrapper[4734]: E1206 00:35:35.888806 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3b374c0-9943-4407-a015-6f596bc75fe1" containerName="registry-server" Dec 06 00:35:35 crc kubenswrapper[4734]: I1206 
00:35:35.888821 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3b374c0-9943-4407-a015-6f596bc75fe1" containerName="registry-server" Dec 06 00:35:35 crc kubenswrapper[4734]: E1206 00:35:35.888837 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3b374c0-9943-4407-a015-6f596bc75fe1" containerName="extract-content" Dec 06 00:35:35 crc kubenswrapper[4734]: I1206 00:35:35.888844 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3b374c0-9943-4407-a015-6f596bc75fe1" containerName="extract-content" Dec 06 00:35:35 crc kubenswrapper[4734]: E1206 00:35:35.888880 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af200ff3-145c-4b87-9184-9bcad46ae2f0" containerName="container-00" Dec 06 00:35:35 crc kubenswrapper[4734]: I1206 00:35:35.888888 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="af200ff3-145c-4b87-9184-9bcad46ae2f0" containerName="container-00" Dec 06 00:35:35 crc kubenswrapper[4734]: E1206 00:35:35.888907 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3b374c0-9943-4407-a015-6f596bc75fe1" containerName="extract-utilities" Dec 06 00:35:35 crc kubenswrapper[4734]: I1206 00:35:35.888913 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3b374c0-9943-4407-a015-6f596bc75fe1" containerName="extract-utilities" Dec 06 00:35:35 crc kubenswrapper[4734]: I1206 00:35:35.889100 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="af200ff3-145c-4b87-9184-9bcad46ae2f0" containerName="container-00" Dec 06 00:35:35 crc kubenswrapper[4734]: I1206 00:35:35.889118 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3b374c0-9943-4407-a015-6f596bc75fe1" containerName="registry-server" Dec 06 00:35:35 crc kubenswrapper[4734]: I1206 00:35:35.890555 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p7hx7" Dec 06 00:35:35 crc kubenswrapper[4734]: I1206 00:35:35.907624 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p7hx7"] Dec 06 00:35:36 crc kubenswrapper[4734]: I1206 00:35:36.028714 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/470d388f-edee-4562-994c-cd1e2744d0a6-utilities\") pod \"certified-operators-p7hx7\" (UID: \"470d388f-edee-4562-994c-cd1e2744d0a6\") " pod="openshift-marketplace/certified-operators-p7hx7" Dec 06 00:35:36 crc kubenswrapper[4734]: I1206 00:35:36.028821 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/470d388f-edee-4562-994c-cd1e2744d0a6-catalog-content\") pod \"certified-operators-p7hx7\" (UID: \"470d388f-edee-4562-994c-cd1e2744d0a6\") " pod="openshift-marketplace/certified-operators-p7hx7" Dec 06 00:35:36 crc kubenswrapper[4734]: I1206 00:35:36.028860 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9p7d\" (UniqueName: \"kubernetes.io/projected/470d388f-edee-4562-994c-cd1e2744d0a6-kube-api-access-j9p7d\") pod \"certified-operators-p7hx7\" (UID: \"470d388f-edee-4562-994c-cd1e2744d0a6\") " pod="openshift-marketplace/certified-operators-p7hx7" Dec 06 00:35:36 crc kubenswrapper[4734]: I1206 00:35:36.130695 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/470d388f-edee-4562-994c-cd1e2744d0a6-catalog-content\") pod \"certified-operators-p7hx7\" (UID: \"470d388f-edee-4562-994c-cd1e2744d0a6\") " pod="openshift-marketplace/certified-operators-p7hx7" Dec 06 00:35:36 crc kubenswrapper[4734]: I1206 00:35:36.130779 4734 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-j9p7d\" (UniqueName: \"kubernetes.io/projected/470d388f-edee-4562-994c-cd1e2744d0a6-kube-api-access-j9p7d\") pod \"certified-operators-p7hx7\" (UID: \"470d388f-edee-4562-994c-cd1e2744d0a6\") " pod="openshift-marketplace/certified-operators-p7hx7" Dec 06 00:35:36 crc kubenswrapper[4734]: I1206 00:35:36.130957 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/470d388f-edee-4562-994c-cd1e2744d0a6-utilities\") pod \"certified-operators-p7hx7\" (UID: \"470d388f-edee-4562-994c-cd1e2744d0a6\") " pod="openshift-marketplace/certified-operators-p7hx7" Dec 06 00:35:36 crc kubenswrapper[4734]: I1206 00:35:36.131595 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/470d388f-edee-4562-994c-cd1e2744d0a6-catalog-content\") pod \"certified-operators-p7hx7\" (UID: \"470d388f-edee-4562-994c-cd1e2744d0a6\") " pod="openshift-marketplace/certified-operators-p7hx7" Dec 06 00:35:36 crc kubenswrapper[4734]: I1206 00:35:36.131610 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/470d388f-edee-4562-994c-cd1e2744d0a6-utilities\") pod \"certified-operators-p7hx7\" (UID: \"470d388f-edee-4562-994c-cd1e2744d0a6\") " pod="openshift-marketplace/certified-operators-p7hx7" Dec 06 00:35:36 crc kubenswrapper[4734]: I1206 00:35:36.160157 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9p7d\" (UniqueName: \"kubernetes.io/projected/470d388f-edee-4562-994c-cd1e2744d0a6-kube-api-access-j9p7d\") pod \"certified-operators-p7hx7\" (UID: \"470d388f-edee-4562-994c-cd1e2744d0a6\") " pod="openshift-marketplace/certified-operators-p7hx7" Dec 06 00:35:36 crc kubenswrapper[4734]: I1206 00:35:36.223886 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p7hx7" Dec 06 00:35:36 crc kubenswrapper[4734]: I1206 00:35:36.786114 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p7hx7"] Dec 06 00:35:37 crc kubenswrapper[4734]: I1206 00:35:37.480409 4734 generic.go:334] "Generic (PLEG): container finished" podID="470d388f-edee-4562-994c-cd1e2744d0a6" containerID="dd62bdfbe39d579587c8b9d5f0d837e10eb96188d99d425a39900cf836eb1ded" exitCode=0 Dec 06 00:35:37 crc kubenswrapper[4734]: I1206 00:35:37.480571 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7hx7" event={"ID":"470d388f-edee-4562-994c-cd1e2744d0a6","Type":"ContainerDied","Data":"dd62bdfbe39d579587c8b9d5f0d837e10eb96188d99d425a39900cf836eb1ded"} Dec 06 00:35:37 crc kubenswrapper[4734]: I1206 00:35:37.480864 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7hx7" event={"ID":"470d388f-edee-4562-994c-cd1e2744d0a6","Type":"ContainerStarted","Data":"573144972f0c20fba4a172a2a691dc6b4821bb7e9acc2cc7032feaa2b65729de"} Dec 06 00:35:37 crc kubenswrapper[4734]: I1206 00:35:37.486258 4734 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 06 00:35:38 crc kubenswrapper[4734]: I1206 00:35:38.495950 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7hx7" event={"ID":"470d388f-edee-4562-994c-cd1e2744d0a6","Type":"ContainerStarted","Data":"a1fa46a660b66f6888e5402fa2e5e57f38a536f01bc7712e97f69689e896e074"} Dec 06 00:35:39 crc kubenswrapper[4734]: I1206 00:35:39.509470 4734 generic.go:334] "Generic (PLEG): container finished" podID="470d388f-edee-4562-994c-cd1e2744d0a6" containerID="a1fa46a660b66f6888e5402fa2e5e57f38a536f01bc7712e97f69689e896e074" exitCode=0 Dec 06 00:35:39 crc kubenswrapper[4734]: I1206 00:35:39.509557 4734 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-p7hx7" event={"ID":"470d388f-edee-4562-994c-cd1e2744d0a6","Type":"ContainerDied","Data":"a1fa46a660b66f6888e5402fa2e5e57f38a536f01bc7712e97f69689e896e074"} Dec 06 00:35:40 crc kubenswrapper[4734]: I1206 00:35:40.522686 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7hx7" event={"ID":"470d388f-edee-4562-994c-cd1e2744d0a6","Type":"ContainerStarted","Data":"92f9cc3b2357813104fcac05a40731a9875438da1571c7e751016aaf35f03224"} Dec 06 00:35:40 crc kubenswrapper[4734]: I1206 00:35:40.548017 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p7hx7" podStartSLOduration=3.110861016 podStartE2EDuration="5.547991821s" podCreationTimestamp="2025-12-06 00:35:35 +0000 UTC" firstStartedPulling="2025-12-06 00:35:37.483277422 +0000 UTC m=+4558.166681698" lastFinishedPulling="2025-12-06 00:35:39.920408227 +0000 UTC m=+4560.603812503" observedRunningTime="2025-12-06 00:35:40.540732312 +0000 UTC m=+4561.224136588" watchObservedRunningTime="2025-12-06 00:35:40.547991821 +0000 UTC m=+4561.231396097" Dec 06 00:35:46 crc kubenswrapper[4734]: I1206 00:35:46.224313 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p7hx7" Dec 06 00:35:46 crc kubenswrapper[4734]: I1206 00:35:46.225208 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p7hx7" Dec 06 00:35:46 crc kubenswrapper[4734]: I1206 00:35:46.280349 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p7hx7" Dec 06 00:35:46 crc kubenswrapper[4734]: I1206 00:35:46.614358 4734 scope.go:117] "RemoveContainer" containerID="defa7a915f6f81a237f6907715eb5c4650a4cd13fa52e345ad569553ddc1f24d" Dec 06 00:35:46 crc kubenswrapper[4734]: E1206 00:35:46.615469 4734 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:35:47 crc kubenswrapper[4734]: I1206 00:35:47.244496 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p7hx7" Dec 06 00:35:47 crc kubenswrapper[4734]: I1206 00:35:47.319003 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p7hx7"] Dec 06 00:35:48 crc kubenswrapper[4734]: I1206 00:35:48.621147 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p7hx7" podUID="470d388f-edee-4562-994c-cd1e2744d0a6" containerName="registry-server" containerID="cri-o://92f9cc3b2357813104fcac05a40731a9875438da1571c7e751016aaf35f03224" gracePeriod=2 Dec 06 00:35:49 crc kubenswrapper[4734]: I1206 00:35:49.633392 4734 generic.go:334] "Generic (PLEG): container finished" podID="470d388f-edee-4562-994c-cd1e2744d0a6" containerID="92f9cc3b2357813104fcac05a40731a9875438da1571c7e751016aaf35f03224" exitCode=0 Dec 06 00:35:49 crc kubenswrapper[4734]: I1206 00:35:49.633569 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7hx7" event={"ID":"470d388f-edee-4562-994c-cd1e2744d0a6","Type":"ContainerDied","Data":"92f9cc3b2357813104fcac05a40731a9875438da1571c7e751016aaf35f03224"} Dec 06 00:35:50 crc kubenswrapper[4734]: I1206 00:35:50.244913 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p7hx7" Dec 06 00:35:50 crc kubenswrapper[4734]: I1206 00:35:50.385814 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/470d388f-edee-4562-994c-cd1e2744d0a6-utilities\") pod \"470d388f-edee-4562-994c-cd1e2744d0a6\" (UID: \"470d388f-edee-4562-994c-cd1e2744d0a6\") " Dec 06 00:35:50 crc kubenswrapper[4734]: I1206 00:35:50.385968 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/470d388f-edee-4562-994c-cd1e2744d0a6-catalog-content\") pod \"470d388f-edee-4562-994c-cd1e2744d0a6\" (UID: \"470d388f-edee-4562-994c-cd1e2744d0a6\") " Dec 06 00:35:50 crc kubenswrapper[4734]: I1206 00:35:50.386170 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9p7d\" (UniqueName: \"kubernetes.io/projected/470d388f-edee-4562-994c-cd1e2744d0a6-kube-api-access-j9p7d\") pod \"470d388f-edee-4562-994c-cd1e2744d0a6\" (UID: \"470d388f-edee-4562-994c-cd1e2744d0a6\") " Dec 06 00:35:50 crc kubenswrapper[4734]: I1206 00:35:50.389585 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/470d388f-edee-4562-994c-cd1e2744d0a6-utilities" (OuterVolumeSpecName: "utilities") pod "470d388f-edee-4562-994c-cd1e2744d0a6" (UID: "470d388f-edee-4562-994c-cd1e2744d0a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 00:35:50 crc kubenswrapper[4734]: I1206 00:35:50.405176 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/470d388f-edee-4562-994c-cd1e2744d0a6-kube-api-access-j9p7d" (OuterVolumeSpecName: "kube-api-access-j9p7d") pod "470d388f-edee-4562-994c-cd1e2744d0a6" (UID: "470d388f-edee-4562-994c-cd1e2744d0a6"). InnerVolumeSpecName "kube-api-access-j9p7d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:35:50 crc kubenswrapper[4734]: I1206 00:35:50.447010 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/470d388f-edee-4562-994c-cd1e2744d0a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "470d388f-edee-4562-994c-cd1e2744d0a6" (UID: "470d388f-edee-4562-994c-cd1e2744d0a6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 00:35:50 crc kubenswrapper[4734]: I1206 00:35:50.489116 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/470d388f-edee-4562-994c-cd1e2744d0a6-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 00:35:50 crc kubenswrapper[4734]: I1206 00:35:50.489143 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/470d388f-edee-4562-994c-cd1e2744d0a6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 00:35:50 crc kubenswrapper[4734]: I1206 00:35:50.489153 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9p7d\" (UniqueName: \"kubernetes.io/projected/470d388f-edee-4562-994c-cd1e2744d0a6-kube-api-access-j9p7d\") on node \"crc\" DevicePath \"\"" Dec 06 00:35:50 crc kubenswrapper[4734]: I1206 00:35:50.647861 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7hx7" event={"ID":"470d388f-edee-4562-994c-cd1e2744d0a6","Type":"ContainerDied","Data":"573144972f0c20fba4a172a2a691dc6b4821bb7e9acc2cc7032feaa2b65729de"} Dec 06 00:35:50 crc kubenswrapper[4734]: I1206 00:35:50.647953 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p7hx7" Dec 06 00:35:50 crc kubenswrapper[4734]: I1206 00:35:50.648011 4734 scope.go:117] "RemoveContainer" containerID="92f9cc3b2357813104fcac05a40731a9875438da1571c7e751016aaf35f03224" Dec 06 00:35:50 crc kubenswrapper[4734]: I1206 00:35:50.691091 4734 scope.go:117] "RemoveContainer" containerID="a1fa46a660b66f6888e5402fa2e5e57f38a536f01bc7712e97f69689e896e074" Dec 06 00:35:50 crc kubenswrapper[4734]: I1206 00:35:50.701447 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p7hx7"] Dec 06 00:35:50 crc kubenswrapper[4734]: I1206 00:35:50.716446 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p7hx7"] Dec 06 00:35:50 crc kubenswrapper[4734]: I1206 00:35:50.723114 4734 scope.go:117] "RemoveContainer" containerID="dd62bdfbe39d579587c8b9d5f0d837e10eb96188d99d425a39900cf836eb1ded" Dec 06 00:35:51 crc kubenswrapper[4734]: I1206 00:35:51.627494 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="470d388f-edee-4562-994c-cd1e2744d0a6" path="/var/lib/kubelet/pods/470d388f-edee-4562-994c-cd1e2744d0a6/volumes" Dec 06 00:35:58 crc kubenswrapper[4734]: I1206 00:35:58.614996 4734 scope.go:117] "RemoveContainer" containerID="defa7a915f6f81a237f6907715eb5c4650a4cd13fa52e345ad569553ddc1f24d" Dec 06 00:35:58 crc kubenswrapper[4734]: E1206 00:35:58.616102 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:36:10 crc kubenswrapper[4734]: I1206 00:36:10.614690 4734 scope.go:117] "RemoveContainer" 
containerID="defa7a915f6f81a237f6907715eb5c4650a4cd13fa52e345ad569553ddc1f24d" Dec 06 00:36:10 crc kubenswrapper[4734]: E1206 00:36:10.615652 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:36:24 crc kubenswrapper[4734]: I1206 00:36:24.615068 4734 scope.go:117] "RemoveContainer" containerID="defa7a915f6f81a237f6907715eb5c4650a4cd13fa52e345ad569553ddc1f24d" Dec 06 00:36:24 crc kubenswrapper[4734]: E1206 00:36:24.616388 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:36:28 crc kubenswrapper[4734]: I1206 00:36:28.095015 4734 generic.go:334] "Generic (PLEG): container finished" podID="7437b2ff-0f14-41c8-a613-f583ed483d0b" containerID="1558380f7380e17cc5d139b70624a126d116b0e81ff5b2fcb94ffaa10c496388" exitCode=0 Dec 06 00:36:28 crc kubenswrapper[4734]: I1206 00:36:28.095104 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5wnj2/must-gather-27zj4" event={"ID":"7437b2ff-0f14-41c8-a613-f583ed483d0b","Type":"ContainerDied","Data":"1558380f7380e17cc5d139b70624a126d116b0e81ff5b2fcb94ffaa10c496388"} Dec 06 00:36:28 crc kubenswrapper[4734]: I1206 00:36:28.096653 4734 scope.go:117] "RemoveContainer" containerID="1558380f7380e17cc5d139b70624a126d116b0e81ff5b2fcb94ffaa10c496388" 
Dec 06 00:36:28 crc kubenswrapper[4734]: I1206 00:36:28.743438 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5wnj2_must-gather-27zj4_7437b2ff-0f14-41c8-a613-f583ed483d0b/gather/0.log" Dec 06 00:36:38 crc kubenswrapper[4734]: I1206 00:36:38.614633 4734 scope.go:117] "RemoveContainer" containerID="defa7a915f6f81a237f6907715eb5c4650a4cd13fa52e345ad569553ddc1f24d" Dec 06 00:36:38 crc kubenswrapper[4734]: E1206 00:36:38.615702 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:36:39 crc kubenswrapper[4734]: I1206 00:36:39.103319 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5wnj2/must-gather-27zj4"] Dec 06 00:36:39 crc kubenswrapper[4734]: I1206 00:36:39.103673 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-5wnj2/must-gather-27zj4" podUID="7437b2ff-0f14-41c8-a613-f583ed483d0b" containerName="copy" containerID="cri-o://b640db05c6941c22238bfe8ece15168ed1e93842773a8339239cbeb90a50c2b0" gracePeriod=2 Dec 06 00:36:39 crc kubenswrapper[4734]: I1206 00:36:39.116342 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5wnj2/must-gather-27zj4"] Dec 06 00:36:39 crc kubenswrapper[4734]: I1206 00:36:39.246940 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5wnj2_must-gather-27zj4_7437b2ff-0f14-41c8-a613-f583ed483d0b/copy/0.log" Dec 06 00:36:39 crc kubenswrapper[4734]: I1206 00:36:39.247292 4734 generic.go:334] "Generic (PLEG): container finished" podID="7437b2ff-0f14-41c8-a613-f583ed483d0b" 
containerID="b640db05c6941c22238bfe8ece15168ed1e93842773a8339239cbeb90a50c2b0" exitCode=143 Dec 06 00:36:40 crc kubenswrapper[4734]: I1206 00:36:40.263254 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5wnj2_must-gather-27zj4_7437b2ff-0f14-41c8-a613-f583ed483d0b/copy/0.log" Dec 06 00:36:40 crc kubenswrapper[4734]: I1206 00:36:40.265100 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5wnj2/must-gather-27zj4" Dec 06 00:36:40 crc kubenswrapper[4734]: I1206 00:36:40.265365 4734 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5wnj2_must-gather-27zj4_7437b2ff-0f14-41c8-a613-f583ed483d0b/copy/0.log" Dec 06 00:36:40 crc kubenswrapper[4734]: I1206 00:36:40.265873 4734 scope.go:117] "RemoveContainer" containerID="b640db05c6941c22238bfe8ece15168ed1e93842773a8339239cbeb90a50c2b0" Dec 06 00:36:40 crc kubenswrapper[4734]: I1206 00:36:40.308210 4734 scope.go:117] "RemoveContainer" containerID="1558380f7380e17cc5d139b70624a126d116b0e81ff5b2fcb94ffaa10c496388" Dec 06 00:36:40 crc kubenswrapper[4734]: I1206 00:36:40.389933 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7437b2ff-0f14-41c8-a613-f583ed483d0b-must-gather-output\") pod \"7437b2ff-0f14-41c8-a613-f583ed483d0b\" (UID: \"7437b2ff-0f14-41c8-a613-f583ed483d0b\") " Dec 06 00:36:40 crc kubenswrapper[4734]: I1206 00:36:40.390181 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvw5g\" (UniqueName: \"kubernetes.io/projected/7437b2ff-0f14-41c8-a613-f583ed483d0b-kube-api-access-bvw5g\") pod \"7437b2ff-0f14-41c8-a613-f583ed483d0b\" (UID: \"7437b2ff-0f14-41c8-a613-f583ed483d0b\") " Dec 06 00:36:40 crc kubenswrapper[4734]: I1206 00:36:40.397373 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7437b2ff-0f14-41c8-a613-f583ed483d0b-kube-api-access-bvw5g" (OuterVolumeSpecName: "kube-api-access-bvw5g") pod "7437b2ff-0f14-41c8-a613-f583ed483d0b" (UID: "7437b2ff-0f14-41c8-a613-f583ed483d0b"). InnerVolumeSpecName "kube-api-access-bvw5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:36:40 crc kubenswrapper[4734]: I1206 00:36:40.493826 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvw5g\" (UniqueName: \"kubernetes.io/projected/7437b2ff-0f14-41c8-a613-f583ed483d0b-kube-api-access-bvw5g\") on node \"crc\" DevicePath \"\"" Dec 06 00:36:40 crc kubenswrapper[4734]: I1206 00:36:40.552346 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7437b2ff-0f14-41c8-a613-f583ed483d0b-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "7437b2ff-0f14-41c8-a613-f583ed483d0b" (UID: "7437b2ff-0f14-41c8-a613-f583ed483d0b"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 00:36:40 crc kubenswrapper[4734]: I1206 00:36:40.598640 4734 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7437b2ff-0f14-41c8-a613-f583ed483d0b-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 06 00:36:41 crc kubenswrapper[4734]: I1206 00:36:41.277663 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5wnj2/must-gather-27zj4" Dec 06 00:36:41 crc kubenswrapper[4734]: I1206 00:36:41.630512 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7437b2ff-0f14-41c8-a613-f583ed483d0b" path="/var/lib/kubelet/pods/7437b2ff-0f14-41c8-a613-f583ed483d0b/volumes" Dec 06 00:36:53 crc kubenswrapper[4734]: I1206 00:36:53.615076 4734 scope.go:117] "RemoveContainer" containerID="defa7a915f6f81a237f6907715eb5c4650a4cd13fa52e345ad569553ddc1f24d" Dec 06 00:36:53 crc kubenswrapper[4734]: E1206 00:36:53.616680 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:36:57 crc kubenswrapper[4734]: I1206 00:36:57.834392 4734 scope.go:117] "RemoveContainer" containerID="9fddd84121db8cbae67843a342f39fc1374895f1481946815a993459c87bd8f5" Dec 06 00:37:04 crc kubenswrapper[4734]: I1206 00:37:04.615054 4734 scope.go:117] "RemoveContainer" containerID="defa7a915f6f81a237f6907715eb5c4650a4cd13fa52e345ad569553ddc1f24d" Dec 06 00:37:04 crc kubenswrapper[4734]: E1206 00:37:04.616111 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:37:18 crc kubenswrapper[4734]: I1206 00:37:18.614993 4734 scope.go:117] "RemoveContainer" 
containerID="defa7a915f6f81a237f6907715eb5c4650a4cd13fa52e345ad569553ddc1f24d" Dec 06 00:37:18 crc kubenswrapper[4734]: E1206 00:37:18.616236 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:37:30 crc kubenswrapper[4734]: I1206 00:37:30.614599 4734 scope.go:117] "RemoveContainer" containerID="defa7a915f6f81a237f6907715eb5c4650a4cd13fa52e345ad569553ddc1f24d" Dec 06 00:37:30 crc kubenswrapper[4734]: E1206 00:37:30.615575 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:37:38 crc kubenswrapper[4734]: I1206 00:37:38.752257 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nl68j"] Dec 06 00:37:38 crc kubenswrapper[4734]: E1206 00:37:38.753759 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7437b2ff-0f14-41c8-a613-f583ed483d0b" containerName="copy" Dec 06 00:37:38 crc kubenswrapper[4734]: I1206 00:37:38.753774 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="7437b2ff-0f14-41c8-a613-f583ed483d0b" containerName="copy" Dec 06 00:37:38 crc kubenswrapper[4734]: E1206 00:37:38.753789 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="470d388f-edee-4562-994c-cd1e2744d0a6" containerName="extract-utilities" Dec 
06 00:37:38 crc kubenswrapper[4734]: I1206 00:37:38.753796 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="470d388f-edee-4562-994c-cd1e2744d0a6" containerName="extract-utilities" Dec 06 00:37:38 crc kubenswrapper[4734]: E1206 00:37:38.753817 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7437b2ff-0f14-41c8-a613-f583ed483d0b" containerName="gather" Dec 06 00:37:38 crc kubenswrapper[4734]: I1206 00:37:38.753824 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="7437b2ff-0f14-41c8-a613-f583ed483d0b" containerName="gather" Dec 06 00:37:38 crc kubenswrapper[4734]: E1206 00:37:38.753834 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="470d388f-edee-4562-994c-cd1e2744d0a6" containerName="extract-content" Dec 06 00:37:38 crc kubenswrapper[4734]: I1206 00:37:38.753839 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="470d388f-edee-4562-994c-cd1e2744d0a6" containerName="extract-content" Dec 06 00:37:38 crc kubenswrapper[4734]: E1206 00:37:38.753868 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="470d388f-edee-4562-994c-cd1e2744d0a6" containerName="registry-server" Dec 06 00:37:38 crc kubenswrapper[4734]: I1206 00:37:38.753873 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="470d388f-edee-4562-994c-cd1e2744d0a6" containerName="registry-server" Dec 06 00:37:38 crc kubenswrapper[4734]: I1206 00:37:38.754068 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="470d388f-edee-4562-994c-cd1e2744d0a6" containerName="registry-server" Dec 06 00:37:38 crc kubenswrapper[4734]: I1206 00:37:38.754089 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="7437b2ff-0f14-41c8-a613-f583ed483d0b" containerName="gather" Dec 06 00:37:38 crc kubenswrapper[4734]: I1206 00:37:38.754108 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="7437b2ff-0f14-41c8-a613-f583ed483d0b" containerName="copy" Dec 06 00:37:38 crc kubenswrapper[4734]: I1206 
00:37:38.755555 4734 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nl68j" Dec 06 00:37:38 crc kubenswrapper[4734]: I1206 00:37:38.809258 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nl68j"] Dec 06 00:37:38 crc kubenswrapper[4734]: I1206 00:37:38.870992 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x49fk\" (UniqueName: \"kubernetes.io/projected/d6fa2616-1a90-4225-8190-5b1f95b1dc4d-kube-api-access-x49fk\") pod \"redhat-marketplace-nl68j\" (UID: \"d6fa2616-1a90-4225-8190-5b1f95b1dc4d\") " pod="openshift-marketplace/redhat-marketplace-nl68j" Dec 06 00:37:38 crc kubenswrapper[4734]: I1206 00:37:38.871269 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6fa2616-1a90-4225-8190-5b1f95b1dc4d-utilities\") pod \"redhat-marketplace-nl68j\" (UID: \"d6fa2616-1a90-4225-8190-5b1f95b1dc4d\") " pod="openshift-marketplace/redhat-marketplace-nl68j" Dec 06 00:37:38 crc kubenswrapper[4734]: I1206 00:37:38.871336 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6fa2616-1a90-4225-8190-5b1f95b1dc4d-catalog-content\") pod \"redhat-marketplace-nl68j\" (UID: \"d6fa2616-1a90-4225-8190-5b1f95b1dc4d\") " pod="openshift-marketplace/redhat-marketplace-nl68j" Dec 06 00:37:38 crc kubenswrapper[4734]: I1206 00:37:38.974419 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6fa2616-1a90-4225-8190-5b1f95b1dc4d-utilities\") pod \"redhat-marketplace-nl68j\" (UID: \"d6fa2616-1a90-4225-8190-5b1f95b1dc4d\") " pod="openshift-marketplace/redhat-marketplace-nl68j" Dec 06 00:37:38 crc kubenswrapper[4734]: I1206 
00:37:38.974516 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6fa2616-1a90-4225-8190-5b1f95b1dc4d-catalog-content\") pod \"redhat-marketplace-nl68j\" (UID: \"d6fa2616-1a90-4225-8190-5b1f95b1dc4d\") " pod="openshift-marketplace/redhat-marketplace-nl68j" Dec 06 00:37:38 crc kubenswrapper[4734]: I1206 00:37:38.974570 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x49fk\" (UniqueName: \"kubernetes.io/projected/d6fa2616-1a90-4225-8190-5b1f95b1dc4d-kube-api-access-x49fk\") pod \"redhat-marketplace-nl68j\" (UID: \"d6fa2616-1a90-4225-8190-5b1f95b1dc4d\") " pod="openshift-marketplace/redhat-marketplace-nl68j" Dec 06 00:37:38 crc kubenswrapper[4734]: I1206 00:37:38.975165 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6fa2616-1a90-4225-8190-5b1f95b1dc4d-catalog-content\") pod \"redhat-marketplace-nl68j\" (UID: \"d6fa2616-1a90-4225-8190-5b1f95b1dc4d\") " pod="openshift-marketplace/redhat-marketplace-nl68j" Dec 06 00:37:38 crc kubenswrapper[4734]: I1206 00:37:38.975332 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6fa2616-1a90-4225-8190-5b1f95b1dc4d-utilities\") pod \"redhat-marketplace-nl68j\" (UID: \"d6fa2616-1a90-4225-8190-5b1f95b1dc4d\") " pod="openshift-marketplace/redhat-marketplace-nl68j" Dec 06 00:37:39 crc kubenswrapper[4734]: I1206 00:37:39.009626 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x49fk\" (UniqueName: \"kubernetes.io/projected/d6fa2616-1a90-4225-8190-5b1f95b1dc4d-kube-api-access-x49fk\") pod \"redhat-marketplace-nl68j\" (UID: \"d6fa2616-1a90-4225-8190-5b1f95b1dc4d\") " pod="openshift-marketplace/redhat-marketplace-nl68j" Dec 06 00:37:39 crc kubenswrapper[4734]: I1206 00:37:39.096003 4734 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nl68j" Dec 06 00:37:39 crc kubenswrapper[4734]: I1206 00:37:39.640464 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nl68j"] Dec 06 00:37:39 crc kubenswrapper[4734]: I1206 00:37:39.905103 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nl68j" event={"ID":"d6fa2616-1a90-4225-8190-5b1f95b1dc4d","Type":"ContainerStarted","Data":"2da6e0f3996c98bc818c9373c9bda40f2c5e37b4436410572fff4c7477d50740"} Dec 06 00:37:39 crc kubenswrapper[4734]: I1206 00:37:39.905607 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nl68j" event={"ID":"d6fa2616-1a90-4225-8190-5b1f95b1dc4d","Type":"ContainerStarted","Data":"96514a5fc6920a85ad0c8df227019b74011893d2168e8859c4390650960a59ab"} Dec 06 00:37:40 crc kubenswrapper[4734]: I1206 00:37:40.920312 4734 generic.go:334] "Generic (PLEG): container finished" podID="d6fa2616-1a90-4225-8190-5b1f95b1dc4d" containerID="2da6e0f3996c98bc818c9373c9bda40f2c5e37b4436410572fff4c7477d50740" exitCode=0 Dec 06 00:37:40 crc kubenswrapper[4734]: I1206 00:37:40.920897 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nl68j" event={"ID":"d6fa2616-1a90-4225-8190-5b1f95b1dc4d","Type":"ContainerDied","Data":"2da6e0f3996c98bc818c9373c9bda40f2c5e37b4436410572fff4c7477d50740"} Dec 06 00:37:41 crc kubenswrapper[4734]: I1206 00:37:41.959105 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nl68j" event={"ID":"d6fa2616-1a90-4225-8190-5b1f95b1dc4d","Type":"ContainerStarted","Data":"0e8ade0278f43d079ab24d0e0c1ebf3527dcbd6616171894f286bb87b6124da4"} Dec 06 00:37:42 crc kubenswrapper[4734]: I1206 00:37:42.615473 4734 scope.go:117] "RemoveContainer" 
containerID="defa7a915f6f81a237f6907715eb5c4650a4cd13fa52e345ad569553ddc1f24d" Dec 06 00:37:42 crc kubenswrapper[4734]: E1206 00:37:42.616448 4734 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vn94d_openshift-machine-config-operator(65758270-a7a7-46b5-af95-0588daf9fa86)\"" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" Dec 06 00:37:42 crc kubenswrapper[4734]: I1206 00:37:42.972081 4734 generic.go:334] "Generic (PLEG): container finished" podID="d6fa2616-1a90-4225-8190-5b1f95b1dc4d" containerID="0e8ade0278f43d079ab24d0e0c1ebf3527dcbd6616171894f286bb87b6124da4" exitCode=0 Dec 06 00:37:42 crc kubenswrapper[4734]: I1206 00:37:42.972139 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nl68j" event={"ID":"d6fa2616-1a90-4225-8190-5b1f95b1dc4d","Type":"ContainerDied","Data":"0e8ade0278f43d079ab24d0e0c1ebf3527dcbd6616171894f286bb87b6124da4"} Dec 06 00:37:44 crc kubenswrapper[4734]: I1206 00:37:44.995920 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nl68j" event={"ID":"d6fa2616-1a90-4225-8190-5b1f95b1dc4d","Type":"ContainerStarted","Data":"e6ef101a07410b60cf1589c31e993949975504f3c23162d05cad51174181538c"} Dec 06 00:37:45 crc kubenswrapper[4734]: I1206 00:37:45.036789 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nl68j" podStartSLOduration=4.577525113 podStartE2EDuration="7.036756951s" podCreationTimestamp="2025-12-06 00:37:38 +0000 UTC" firstStartedPulling="2025-12-06 00:37:40.92374627 +0000 UTC m=+4681.607150536" lastFinishedPulling="2025-12-06 00:37:43.382978098 +0000 UTC m=+4684.066382374" observedRunningTime="2025-12-06 00:37:45.028398415 +0000 UTC 
m=+4685.711802711" watchObservedRunningTime="2025-12-06 00:37:45.036756951 +0000 UTC m=+4685.720161227" Dec 06 00:37:49 crc kubenswrapper[4734]: I1206 00:37:49.097185 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nl68j" Dec 06 00:37:49 crc kubenswrapper[4734]: I1206 00:37:49.097943 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nl68j" Dec 06 00:37:49 crc kubenswrapper[4734]: I1206 00:37:49.164725 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nl68j" Dec 06 00:37:50 crc kubenswrapper[4734]: I1206 00:37:50.094713 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nl68j" Dec 06 00:37:50 crc kubenswrapper[4734]: I1206 00:37:50.158757 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nl68j"] Dec 06 00:37:52 crc kubenswrapper[4734]: I1206 00:37:52.077647 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nl68j" podUID="d6fa2616-1a90-4225-8190-5b1f95b1dc4d" containerName="registry-server" containerID="cri-o://e6ef101a07410b60cf1589c31e993949975504f3c23162d05cad51174181538c" gracePeriod=2 Dec 06 00:37:52 crc kubenswrapper[4734]: I1206 00:37:52.554368 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nl68j" Dec 06 00:37:52 crc kubenswrapper[4734]: I1206 00:37:52.625604 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x49fk\" (UniqueName: \"kubernetes.io/projected/d6fa2616-1a90-4225-8190-5b1f95b1dc4d-kube-api-access-x49fk\") pod \"d6fa2616-1a90-4225-8190-5b1f95b1dc4d\" (UID: \"d6fa2616-1a90-4225-8190-5b1f95b1dc4d\") " Dec 06 00:37:52 crc kubenswrapper[4734]: I1206 00:37:52.625686 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6fa2616-1a90-4225-8190-5b1f95b1dc4d-utilities\") pod \"d6fa2616-1a90-4225-8190-5b1f95b1dc4d\" (UID: \"d6fa2616-1a90-4225-8190-5b1f95b1dc4d\") " Dec 06 00:37:52 crc kubenswrapper[4734]: I1206 00:37:52.625834 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6fa2616-1a90-4225-8190-5b1f95b1dc4d-catalog-content\") pod \"d6fa2616-1a90-4225-8190-5b1f95b1dc4d\" (UID: \"d6fa2616-1a90-4225-8190-5b1f95b1dc4d\") " Dec 06 00:37:52 crc kubenswrapper[4734]: I1206 00:37:52.627071 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6fa2616-1a90-4225-8190-5b1f95b1dc4d-utilities" (OuterVolumeSpecName: "utilities") pod "d6fa2616-1a90-4225-8190-5b1f95b1dc4d" (UID: "d6fa2616-1a90-4225-8190-5b1f95b1dc4d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 00:37:52 crc kubenswrapper[4734]: I1206 00:37:52.635879 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6fa2616-1a90-4225-8190-5b1f95b1dc4d-kube-api-access-x49fk" (OuterVolumeSpecName: "kube-api-access-x49fk") pod "d6fa2616-1a90-4225-8190-5b1f95b1dc4d" (UID: "d6fa2616-1a90-4225-8190-5b1f95b1dc4d"). InnerVolumeSpecName "kube-api-access-x49fk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:37:52 crc kubenswrapper[4734]: I1206 00:37:52.653209 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6fa2616-1a90-4225-8190-5b1f95b1dc4d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d6fa2616-1a90-4225-8190-5b1f95b1dc4d" (UID: "d6fa2616-1a90-4225-8190-5b1f95b1dc4d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 00:37:52 crc kubenswrapper[4734]: I1206 00:37:52.727080 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x49fk\" (UniqueName: \"kubernetes.io/projected/d6fa2616-1a90-4225-8190-5b1f95b1dc4d-kube-api-access-x49fk\") on node \"crc\" DevicePath \"\"" Dec 06 00:37:52 crc kubenswrapper[4734]: I1206 00:37:52.727295 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6fa2616-1a90-4225-8190-5b1f95b1dc4d-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 00:37:52 crc kubenswrapper[4734]: I1206 00:37:52.727364 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6fa2616-1a90-4225-8190-5b1f95b1dc4d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 00:37:53 crc kubenswrapper[4734]: I1206 00:37:53.094751 4734 generic.go:334] "Generic (PLEG): container finished" podID="d6fa2616-1a90-4225-8190-5b1f95b1dc4d" containerID="e6ef101a07410b60cf1589c31e993949975504f3c23162d05cad51174181538c" exitCode=0 Dec 06 00:37:53 crc kubenswrapper[4734]: I1206 00:37:53.094826 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nl68j" event={"ID":"d6fa2616-1a90-4225-8190-5b1f95b1dc4d","Type":"ContainerDied","Data":"e6ef101a07410b60cf1589c31e993949975504f3c23162d05cad51174181538c"} Dec 06 00:37:53 crc kubenswrapper[4734]: I1206 00:37:53.094873 4734 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-nl68j" event={"ID":"d6fa2616-1a90-4225-8190-5b1f95b1dc4d","Type":"ContainerDied","Data":"96514a5fc6920a85ad0c8df227019b74011893d2168e8859c4390650960a59ab"} Dec 06 00:37:53 crc kubenswrapper[4734]: I1206 00:37:53.094929 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nl68j" Dec 06 00:37:53 crc kubenswrapper[4734]: I1206 00:37:53.094951 4734 scope.go:117] "RemoveContainer" containerID="e6ef101a07410b60cf1589c31e993949975504f3c23162d05cad51174181538c" Dec 06 00:37:53 crc kubenswrapper[4734]: I1206 00:37:53.121765 4734 scope.go:117] "RemoveContainer" containerID="0e8ade0278f43d079ab24d0e0c1ebf3527dcbd6616171894f286bb87b6124da4" Dec 06 00:37:53 crc kubenswrapper[4734]: I1206 00:37:53.142473 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nl68j"] Dec 06 00:37:53 crc kubenswrapper[4734]: I1206 00:37:53.155364 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nl68j"] Dec 06 00:37:53 crc kubenswrapper[4734]: I1206 00:37:53.168434 4734 scope.go:117] "RemoveContainer" containerID="2da6e0f3996c98bc818c9373c9bda40f2c5e37b4436410572fff4c7477d50740" Dec 06 00:37:53 crc kubenswrapper[4734]: I1206 00:37:53.214731 4734 scope.go:117] "RemoveContainer" containerID="e6ef101a07410b60cf1589c31e993949975504f3c23162d05cad51174181538c" Dec 06 00:37:53 crc kubenswrapper[4734]: E1206 00:37:53.215317 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6ef101a07410b60cf1589c31e993949975504f3c23162d05cad51174181538c\": container with ID starting with e6ef101a07410b60cf1589c31e993949975504f3c23162d05cad51174181538c not found: ID does not exist" containerID="e6ef101a07410b60cf1589c31e993949975504f3c23162d05cad51174181538c" Dec 06 00:37:53 crc kubenswrapper[4734]: I1206 00:37:53.215370 4734 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6ef101a07410b60cf1589c31e993949975504f3c23162d05cad51174181538c"} err="failed to get container status \"e6ef101a07410b60cf1589c31e993949975504f3c23162d05cad51174181538c\": rpc error: code = NotFound desc = could not find container \"e6ef101a07410b60cf1589c31e993949975504f3c23162d05cad51174181538c\": container with ID starting with e6ef101a07410b60cf1589c31e993949975504f3c23162d05cad51174181538c not found: ID does not exist" Dec 06 00:37:53 crc kubenswrapper[4734]: I1206 00:37:53.215405 4734 scope.go:117] "RemoveContainer" containerID="0e8ade0278f43d079ab24d0e0c1ebf3527dcbd6616171894f286bb87b6124da4" Dec 06 00:37:53 crc kubenswrapper[4734]: E1206 00:37:53.215769 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e8ade0278f43d079ab24d0e0c1ebf3527dcbd6616171894f286bb87b6124da4\": container with ID starting with 0e8ade0278f43d079ab24d0e0c1ebf3527dcbd6616171894f286bb87b6124da4 not found: ID does not exist" containerID="0e8ade0278f43d079ab24d0e0c1ebf3527dcbd6616171894f286bb87b6124da4" Dec 06 00:37:53 crc kubenswrapper[4734]: I1206 00:37:53.215786 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e8ade0278f43d079ab24d0e0c1ebf3527dcbd6616171894f286bb87b6124da4"} err="failed to get container status \"0e8ade0278f43d079ab24d0e0c1ebf3527dcbd6616171894f286bb87b6124da4\": rpc error: code = NotFound desc = could not find container \"0e8ade0278f43d079ab24d0e0c1ebf3527dcbd6616171894f286bb87b6124da4\": container with ID starting with 0e8ade0278f43d079ab24d0e0c1ebf3527dcbd6616171894f286bb87b6124da4 not found: ID does not exist" Dec 06 00:37:53 crc kubenswrapper[4734]: I1206 00:37:53.215804 4734 scope.go:117] "RemoveContainer" containerID="2da6e0f3996c98bc818c9373c9bda40f2c5e37b4436410572fff4c7477d50740" Dec 06 00:37:53 crc kubenswrapper[4734]: E1206 
00:37:53.216142 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2da6e0f3996c98bc818c9373c9bda40f2c5e37b4436410572fff4c7477d50740\": container with ID starting with 2da6e0f3996c98bc818c9373c9bda40f2c5e37b4436410572fff4c7477d50740 not found: ID does not exist" containerID="2da6e0f3996c98bc818c9373c9bda40f2c5e37b4436410572fff4c7477d50740" Dec 06 00:37:53 crc kubenswrapper[4734]: I1206 00:37:53.216164 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2da6e0f3996c98bc818c9373c9bda40f2c5e37b4436410572fff4c7477d50740"} err="failed to get container status \"2da6e0f3996c98bc818c9373c9bda40f2c5e37b4436410572fff4c7477d50740\": rpc error: code = NotFound desc = could not find container \"2da6e0f3996c98bc818c9373c9bda40f2c5e37b4436410572fff4c7477d50740\": container with ID starting with 2da6e0f3996c98bc818c9373c9bda40f2c5e37b4436410572fff4c7477d50740 not found: ID does not exist" Dec 06 00:37:53 crc kubenswrapper[4734]: I1206 00:37:53.628454 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6fa2616-1a90-4225-8190-5b1f95b1dc4d" path="/var/lib/kubelet/pods/d6fa2616-1a90-4225-8190-5b1f95b1dc4d/volumes" Dec 06 00:37:54 crc kubenswrapper[4734]: I1206 00:37:54.614558 4734 scope.go:117] "RemoveContainer" containerID="defa7a915f6f81a237f6907715eb5c4650a4cd13fa52e345ad569553ddc1f24d" Dec 06 00:37:55 crc kubenswrapper[4734]: I1206 00:37:55.122884 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" event={"ID":"65758270-a7a7-46b5-af95-0588daf9fa86","Type":"ContainerStarted","Data":"9e012000c4d80721e5b2fec136cff2b62d5fa624fda3a027800be8c11495603a"} Dec 06 00:38:38 crc kubenswrapper[4734]: I1206 00:38:38.968857 4734 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z7c2j"] Dec 06 00:38:38 crc kubenswrapper[4734]: E1206 
00:38:38.970231 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6fa2616-1a90-4225-8190-5b1f95b1dc4d" containerName="registry-server" Dec 06 00:38:38 crc kubenswrapper[4734]: I1206 00:38:38.970254 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6fa2616-1a90-4225-8190-5b1f95b1dc4d" containerName="registry-server" Dec 06 00:38:38 crc kubenswrapper[4734]: E1206 00:38:38.970299 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6fa2616-1a90-4225-8190-5b1f95b1dc4d" containerName="extract-utilities" Dec 06 00:38:38 crc kubenswrapper[4734]: I1206 00:38:38.970310 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6fa2616-1a90-4225-8190-5b1f95b1dc4d" containerName="extract-utilities" Dec 06 00:38:38 crc kubenswrapper[4734]: E1206 00:38:38.970356 4734 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6fa2616-1a90-4225-8190-5b1f95b1dc4d" containerName="extract-content" Dec 06 00:38:38 crc kubenswrapper[4734]: I1206 00:38:38.970365 4734 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6fa2616-1a90-4225-8190-5b1f95b1dc4d" containerName="extract-content" Dec 06 00:38:38 crc kubenswrapper[4734]: I1206 00:38:38.970705 4734 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6fa2616-1a90-4225-8190-5b1f95b1dc4d" containerName="registry-server" Dec 06 00:38:38 crc kubenswrapper[4734]: I1206 00:38:38.972766 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z7c2j" Dec 06 00:38:39 crc kubenswrapper[4734]: I1206 00:38:39.008777 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z7c2j"] Dec 06 00:38:39 crc kubenswrapper[4734]: I1206 00:38:39.110466 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2a42a2a-fbcf-43c1-b94a-edc319bdcb03-catalog-content\") pod \"redhat-operators-z7c2j\" (UID: \"e2a42a2a-fbcf-43c1-b94a-edc319bdcb03\") " pod="openshift-marketplace/redhat-operators-z7c2j" Dec 06 00:38:39 crc kubenswrapper[4734]: I1206 00:38:39.110824 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2a42a2a-fbcf-43c1-b94a-edc319bdcb03-utilities\") pod \"redhat-operators-z7c2j\" (UID: \"e2a42a2a-fbcf-43c1-b94a-edc319bdcb03\") " pod="openshift-marketplace/redhat-operators-z7c2j" Dec 06 00:38:39 crc kubenswrapper[4734]: I1206 00:38:39.110880 4734 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vftj\" (UniqueName: \"kubernetes.io/projected/e2a42a2a-fbcf-43c1-b94a-edc319bdcb03-kube-api-access-5vftj\") pod \"redhat-operators-z7c2j\" (UID: \"e2a42a2a-fbcf-43c1-b94a-edc319bdcb03\") " pod="openshift-marketplace/redhat-operators-z7c2j" Dec 06 00:38:39 crc kubenswrapper[4734]: I1206 00:38:39.214230 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2a42a2a-fbcf-43c1-b94a-edc319bdcb03-catalog-content\") pod \"redhat-operators-z7c2j\" (UID: \"e2a42a2a-fbcf-43c1-b94a-edc319bdcb03\") " pod="openshift-marketplace/redhat-operators-z7c2j" Dec 06 00:38:39 crc kubenswrapper[4734]: I1206 00:38:39.214393 4734 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2a42a2a-fbcf-43c1-b94a-edc319bdcb03-utilities\") pod \"redhat-operators-z7c2j\" (UID: \"e2a42a2a-fbcf-43c1-b94a-edc319bdcb03\") " pod="openshift-marketplace/redhat-operators-z7c2j" Dec 06 00:38:39 crc kubenswrapper[4734]: I1206 00:38:39.214419 4734 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vftj\" (UniqueName: \"kubernetes.io/projected/e2a42a2a-fbcf-43c1-b94a-edc319bdcb03-kube-api-access-5vftj\") pod \"redhat-operators-z7c2j\" (UID: \"e2a42a2a-fbcf-43c1-b94a-edc319bdcb03\") " pod="openshift-marketplace/redhat-operators-z7c2j" Dec 06 00:38:39 crc kubenswrapper[4734]: I1206 00:38:39.214906 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2a42a2a-fbcf-43c1-b94a-edc319bdcb03-catalog-content\") pod \"redhat-operators-z7c2j\" (UID: \"e2a42a2a-fbcf-43c1-b94a-edc319bdcb03\") " pod="openshift-marketplace/redhat-operators-z7c2j" Dec 06 00:38:39 crc kubenswrapper[4734]: I1206 00:38:39.215119 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2a42a2a-fbcf-43c1-b94a-edc319bdcb03-utilities\") pod \"redhat-operators-z7c2j\" (UID: \"e2a42a2a-fbcf-43c1-b94a-edc319bdcb03\") " pod="openshift-marketplace/redhat-operators-z7c2j" Dec 06 00:38:39 crc kubenswrapper[4734]: I1206 00:38:39.299779 4734 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vftj\" (UniqueName: \"kubernetes.io/projected/e2a42a2a-fbcf-43c1-b94a-edc319bdcb03-kube-api-access-5vftj\") pod \"redhat-operators-z7c2j\" (UID: \"e2a42a2a-fbcf-43c1-b94a-edc319bdcb03\") " pod="openshift-marketplace/redhat-operators-z7c2j" Dec 06 00:38:39 crc kubenswrapper[4734]: I1206 00:38:39.303646 4734 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z7c2j" Dec 06 00:38:39 crc kubenswrapper[4734]: I1206 00:38:39.940029 4734 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z7c2j"] Dec 06 00:38:40 crc kubenswrapper[4734]: I1206 00:38:40.598780 4734 generic.go:334] "Generic (PLEG): container finished" podID="e2a42a2a-fbcf-43c1-b94a-edc319bdcb03" containerID="d1323684e198e2ed980a41de25bd8f711d8ebbb41e41d0107ffa16f1beb936b1" exitCode=0 Dec 06 00:38:40 crc kubenswrapper[4734]: I1206 00:38:40.598874 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7c2j" event={"ID":"e2a42a2a-fbcf-43c1-b94a-edc319bdcb03","Type":"ContainerDied","Data":"d1323684e198e2ed980a41de25bd8f711d8ebbb41e41d0107ffa16f1beb936b1"} Dec 06 00:38:40 crc kubenswrapper[4734]: I1206 00:38:40.599254 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7c2j" event={"ID":"e2a42a2a-fbcf-43c1-b94a-edc319bdcb03","Type":"ContainerStarted","Data":"188688d68c30851bc5593e6cd31b9f6748f233f7611982b567544c204c9e5474"} Dec 06 00:38:42 crc kubenswrapper[4734]: I1206 00:38:42.620193 4734 generic.go:334] "Generic (PLEG): container finished" podID="e2a42a2a-fbcf-43c1-b94a-edc319bdcb03" containerID="16eae59d7a138e36814fb0ee51d95560dfd8044263c66a2e58b24edf3edc4e6f" exitCode=0 Dec 06 00:38:42 crc kubenswrapper[4734]: I1206 00:38:42.620284 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7c2j" event={"ID":"e2a42a2a-fbcf-43c1-b94a-edc319bdcb03","Type":"ContainerDied","Data":"16eae59d7a138e36814fb0ee51d95560dfd8044263c66a2e58b24edf3edc4e6f"} Dec 06 00:38:43 crc kubenswrapper[4734]: I1206 00:38:43.632678 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7c2j" 
event={"ID":"e2a42a2a-fbcf-43c1-b94a-edc319bdcb03","Type":"ContainerStarted","Data":"8677a3d0e5f28e1d781e1bb22b84a15ed98f8ff23b83fd5243af610775f30ee0"} Dec 06 00:38:43 crc kubenswrapper[4734]: I1206 00:38:43.671048 4734 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z7c2j" podStartSLOduration=3.229148516 podStartE2EDuration="5.671020788s" podCreationTimestamp="2025-12-06 00:38:38 +0000 UTC" firstStartedPulling="2025-12-06 00:38:40.601677134 +0000 UTC m=+4741.285081410" lastFinishedPulling="2025-12-06 00:38:43.043549396 +0000 UTC m=+4743.726953682" observedRunningTime="2025-12-06 00:38:43.660400797 +0000 UTC m=+4744.343805073" watchObservedRunningTime="2025-12-06 00:38:43.671020788 +0000 UTC m=+4744.354425064" Dec 06 00:38:49 crc kubenswrapper[4734]: I1206 00:38:49.305097 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z7c2j" Dec 06 00:38:49 crc kubenswrapper[4734]: I1206 00:38:49.305939 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z7c2j" Dec 06 00:38:49 crc kubenswrapper[4734]: I1206 00:38:49.863707 4734 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z7c2j" Dec 06 00:38:49 crc kubenswrapper[4734]: I1206 00:38:49.933369 4734 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z7c2j" Dec 06 00:38:50 crc kubenswrapper[4734]: I1206 00:38:50.112367 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z7c2j"] Dec 06 00:38:51 crc kubenswrapper[4734]: I1206 00:38:51.794404 4734 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z7c2j" podUID="e2a42a2a-fbcf-43c1-b94a-edc319bdcb03" containerName="registry-server" 
containerID="cri-o://8677a3d0e5f28e1d781e1bb22b84a15ed98f8ff23b83fd5243af610775f30ee0" gracePeriod=2 Dec 06 00:38:52 crc kubenswrapper[4734]: I1206 00:38:52.274294 4734 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z7c2j" Dec 06 00:38:52 crc kubenswrapper[4734]: I1206 00:38:52.362706 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2a42a2a-fbcf-43c1-b94a-edc319bdcb03-utilities\") pod \"e2a42a2a-fbcf-43c1-b94a-edc319bdcb03\" (UID: \"e2a42a2a-fbcf-43c1-b94a-edc319bdcb03\") " Dec 06 00:38:52 crc kubenswrapper[4734]: I1206 00:38:52.362954 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2a42a2a-fbcf-43c1-b94a-edc319bdcb03-catalog-content\") pod \"e2a42a2a-fbcf-43c1-b94a-edc319bdcb03\" (UID: \"e2a42a2a-fbcf-43c1-b94a-edc319bdcb03\") " Dec 06 00:38:52 crc kubenswrapper[4734]: I1206 00:38:52.362983 4734 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vftj\" (UniqueName: \"kubernetes.io/projected/e2a42a2a-fbcf-43c1-b94a-edc319bdcb03-kube-api-access-5vftj\") pod \"e2a42a2a-fbcf-43c1-b94a-edc319bdcb03\" (UID: \"e2a42a2a-fbcf-43c1-b94a-edc319bdcb03\") " Dec 06 00:38:52 crc kubenswrapper[4734]: I1206 00:38:52.363971 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2a42a2a-fbcf-43c1-b94a-edc319bdcb03-utilities" (OuterVolumeSpecName: "utilities") pod "e2a42a2a-fbcf-43c1-b94a-edc319bdcb03" (UID: "e2a42a2a-fbcf-43c1-b94a-edc319bdcb03"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 00:38:52 crc kubenswrapper[4734]: I1206 00:38:52.370591 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2a42a2a-fbcf-43c1-b94a-edc319bdcb03-kube-api-access-5vftj" (OuterVolumeSpecName: "kube-api-access-5vftj") pod "e2a42a2a-fbcf-43c1-b94a-edc319bdcb03" (UID: "e2a42a2a-fbcf-43c1-b94a-edc319bdcb03"). InnerVolumeSpecName "kube-api-access-5vftj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 06 00:38:52 crc kubenswrapper[4734]: I1206 00:38:52.465494 4734 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vftj\" (UniqueName: \"kubernetes.io/projected/e2a42a2a-fbcf-43c1-b94a-edc319bdcb03-kube-api-access-5vftj\") on node \"crc\" DevicePath \"\"" Dec 06 00:38:52 crc kubenswrapper[4734]: I1206 00:38:52.465554 4734 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2a42a2a-fbcf-43c1-b94a-edc319bdcb03-utilities\") on node \"crc\" DevicePath \"\"" Dec 06 00:38:52 crc kubenswrapper[4734]: I1206 00:38:52.815837 4734 generic.go:334] "Generic (PLEG): container finished" podID="e2a42a2a-fbcf-43c1-b94a-edc319bdcb03" containerID="8677a3d0e5f28e1d781e1bb22b84a15ed98f8ff23b83fd5243af610775f30ee0" exitCode=0 Dec 06 00:38:52 crc kubenswrapper[4734]: I1206 00:38:52.815930 4734 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z7c2j" Dec 06 00:38:52 crc kubenswrapper[4734]: I1206 00:38:52.815919 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7c2j" event={"ID":"e2a42a2a-fbcf-43c1-b94a-edc319bdcb03","Type":"ContainerDied","Data":"8677a3d0e5f28e1d781e1bb22b84a15ed98f8ff23b83fd5243af610775f30ee0"} Dec 06 00:38:52 crc kubenswrapper[4734]: I1206 00:38:52.818127 4734 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7c2j" event={"ID":"e2a42a2a-fbcf-43c1-b94a-edc319bdcb03","Type":"ContainerDied","Data":"188688d68c30851bc5593e6cd31b9f6748f233f7611982b567544c204c9e5474"} Dec 06 00:38:52 crc kubenswrapper[4734]: I1206 00:38:52.818183 4734 scope.go:117] "RemoveContainer" containerID="8677a3d0e5f28e1d781e1bb22b84a15ed98f8ff23b83fd5243af610775f30ee0" Dec 06 00:38:52 crc kubenswrapper[4734]: I1206 00:38:52.852380 4734 scope.go:117] "RemoveContainer" containerID="16eae59d7a138e36814fb0ee51d95560dfd8044263c66a2e58b24edf3edc4e6f" Dec 06 00:38:52 crc kubenswrapper[4734]: I1206 00:38:52.898451 4734 scope.go:117] "RemoveContainer" containerID="d1323684e198e2ed980a41de25bd8f711d8ebbb41e41d0107ffa16f1beb936b1" Dec 06 00:38:52 crc kubenswrapper[4734]: I1206 00:38:52.945356 4734 scope.go:117] "RemoveContainer" containerID="8677a3d0e5f28e1d781e1bb22b84a15ed98f8ff23b83fd5243af610775f30ee0" Dec 06 00:38:52 crc kubenswrapper[4734]: E1206 00:38:52.945809 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8677a3d0e5f28e1d781e1bb22b84a15ed98f8ff23b83fd5243af610775f30ee0\": container with ID starting with 8677a3d0e5f28e1d781e1bb22b84a15ed98f8ff23b83fd5243af610775f30ee0 not found: ID does not exist" containerID="8677a3d0e5f28e1d781e1bb22b84a15ed98f8ff23b83fd5243af610775f30ee0" Dec 06 00:38:52 crc kubenswrapper[4734]: I1206 00:38:52.945860 4734 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8677a3d0e5f28e1d781e1bb22b84a15ed98f8ff23b83fd5243af610775f30ee0"} err="failed to get container status \"8677a3d0e5f28e1d781e1bb22b84a15ed98f8ff23b83fd5243af610775f30ee0\": rpc error: code = NotFound desc = could not find container \"8677a3d0e5f28e1d781e1bb22b84a15ed98f8ff23b83fd5243af610775f30ee0\": container with ID starting with 8677a3d0e5f28e1d781e1bb22b84a15ed98f8ff23b83fd5243af610775f30ee0 not found: ID does not exist" Dec 06 00:38:52 crc kubenswrapper[4734]: I1206 00:38:52.945894 4734 scope.go:117] "RemoveContainer" containerID="16eae59d7a138e36814fb0ee51d95560dfd8044263c66a2e58b24edf3edc4e6f" Dec 06 00:38:52 crc kubenswrapper[4734]: E1206 00:38:52.946121 4734 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16eae59d7a138e36814fb0ee51d95560dfd8044263c66a2e58b24edf3edc4e6f\": container with ID starting with 16eae59d7a138e36814fb0ee51d95560dfd8044263c66a2e58b24edf3edc4e6f not found: ID does not exist" containerID="16eae59d7a138e36814fb0ee51d95560dfd8044263c66a2e58b24edf3edc4e6f" Dec 06 00:38:52 crc kubenswrapper[4734]: I1206 00:38:52.946151 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16eae59d7a138e36814fb0ee51d95560dfd8044263c66a2e58b24edf3edc4e6f"} err="failed to get container status \"16eae59d7a138e36814fb0ee51d95560dfd8044263c66a2e58b24edf3edc4e6f\": rpc error: code = NotFound desc = could not find container \"16eae59d7a138e36814fb0ee51d95560dfd8044263c66a2e58b24edf3edc4e6f\": container with ID starting with 16eae59d7a138e36814fb0ee51d95560dfd8044263c66a2e58b24edf3edc4e6f not found: ID does not exist" Dec 06 00:38:52 crc kubenswrapper[4734]: I1206 00:38:52.946169 4734 scope.go:117] "RemoveContainer" containerID="d1323684e198e2ed980a41de25bd8f711d8ebbb41e41d0107ffa16f1beb936b1" Dec 06 00:38:52 crc kubenswrapper[4734]: E1206 00:38:52.946394 4734 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1323684e198e2ed980a41de25bd8f711d8ebbb41e41d0107ffa16f1beb936b1\": container with ID starting with d1323684e198e2ed980a41de25bd8f711d8ebbb41e41d0107ffa16f1beb936b1 not found: ID does not exist" containerID="d1323684e198e2ed980a41de25bd8f711d8ebbb41e41d0107ffa16f1beb936b1" Dec 06 00:38:52 crc kubenswrapper[4734]: I1206 00:38:52.946423 4734 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1323684e198e2ed980a41de25bd8f711d8ebbb41e41d0107ffa16f1beb936b1"} err="failed to get container status \"d1323684e198e2ed980a41de25bd8f711d8ebbb41e41d0107ffa16f1beb936b1\": rpc error: code = NotFound desc = could not find container \"d1323684e198e2ed980a41de25bd8f711d8ebbb41e41d0107ffa16f1beb936b1\": container with ID starting with d1323684e198e2ed980a41de25bd8f711d8ebbb41e41d0107ffa16f1beb936b1 not found: ID does not exist" Dec 06 00:38:53 crc kubenswrapper[4734]: I1206 00:38:53.601042 4734 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2a42a2a-fbcf-43c1-b94a-edc319bdcb03-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e2a42a2a-fbcf-43c1-b94a-edc319bdcb03" (UID: "e2a42a2a-fbcf-43c1-b94a-edc319bdcb03"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 06 00:38:53 crc kubenswrapper[4734]: I1206 00:38:53.692971 4734 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2a42a2a-fbcf-43c1-b94a-edc319bdcb03-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 06 00:38:53 crc kubenswrapper[4734]: I1206 00:38:53.749199 4734 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z7c2j"] Dec 06 00:38:53 crc kubenswrapper[4734]: I1206 00:38:53.760016 4734 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z7c2j"] Dec 06 00:38:55 crc kubenswrapper[4734]: I1206 00:38:55.629732 4734 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2a42a2a-fbcf-43c1-b94a-edc319bdcb03" path="/var/lib/kubelet/pods/e2a42a2a-fbcf-43c1-b94a-edc319bdcb03/volumes" Dec 06 00:40:20 crc kubenswrapper[4734]: I1206 00:40:20.444920 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 06 00:40:20 crc kubenswrapper[4734]: I1206 00:40:20.445643 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 06 00:40:50 crc kubenswrapper[4734]: I1206 00:40:50.445093 4734 patch_prober.go:28] interesting pod/machine-config-daemon-vn94d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Dec 06 00:40:50 crc kubenswrapper[4734]: I1206 00:40:50.445889 4734 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vn94d" podUID="65758270-a7a7-46b5-af95-0588daf9fa86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"